Feb 01 06:47:28 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 01 06:47:28 crc restorecon[4815]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:28 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:28 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 
06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 01 06:47:29 crc 
restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 
06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 
06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc 
restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 01 06:47:29 crc restorecon[4815]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 01 06:47:29 crc kubenswrapper[5127]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.968490 5127 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972773 5127 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972801 5127 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972809 5127 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972816 5127 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972822 5127 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972828 5127 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972834 5127 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972841 5127 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972847 5127 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972854 5127 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972860 5127 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972865 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972879 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972885 5127 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972890 5127 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972896 5127 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972900 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972906 5127 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972911 5127 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972915 5127 feature_gate.go:330] unrecognized feature gate: 
BootcNodeManagement Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972920 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972925 5127 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972930 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972935 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972940 5127 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972945 5127 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972950 5127 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972957 5127 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972963 5127 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972969 5127 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972974 5127 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972980 5127 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972986 5127 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972991 5127 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.972996 5127 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973001 5127 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973006 5127 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973011 5127 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973016 5127 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973021 5127 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973026 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973032 5127 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973038 5127 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973044 5127 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973049 5127 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973054 5127 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973059 5127 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973063 5127 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973068 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973073 5127 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973078 5127 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973083 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973088 5127 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973092 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973101 5127 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973107 5127 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973112 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973117 5127 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973121 5127 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973126 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973131 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973136 5127 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973141 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973145 5127 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973150 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973155 5127 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973159 5127 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973164 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973169 5127 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973174 5127 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.973179 5127 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS 
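Each unrecognized feature gate warning in the block above (and in the near-identical repeats further down) names what appears to be an OpenShift-level gate, e.g. NewOLM or GatewayAPI, that the embedded upstream kubelet's gate registry does not know; feature_gate.go:330 logs the unknown name at warning level and carries on rather than failing, so the block is noisy but not fatal. The gates the kubelet actually honors are the ones echoed later in the feature gates: {map[...]} lines.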
Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973309 5127 flags.go:64] FLAG: --address="0.0.0.0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973322 5127 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973335 5127 flags.go:64] FLAG: --anonymous-auth="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973345 5127 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973353 5127 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973360 5127 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973369 5127 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973376 5127 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973383 5127 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973389 5127 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973395 5127 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973402 5127 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973410 5127 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973417 5127 flags.go:64] FLAG: --cgroup-root="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973424 5127 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973431 5127 flags.go:64] FLAG: --client-ca-file="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973438 5127 flags.go:64] FLAG: --cloud-config="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973444 5127 flags.go:64] FLAG: --cloud-provider="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973451 5127 flags.go:64] FLAG: --cluster-dns="[]" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973460 5127 flags.go:64] FLAG: --cluster-domain="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973467 5127 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973473 5127 flags.go:64] FLAG: --config-dir="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973481 5127 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973489 5127 flags.go:64] FLAG: --container-log-max-files="5" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973499 5127 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973506 5127 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973512 5127 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973518 5127 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973524 5127 flags.go:64] FLAG: --contention-profiling="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 
06:47:29.973531 5127 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973538 5127 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973546 5127 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973554 5127 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973564 5127 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973574 5127 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973607 5127 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973613 5127 flags.go:64] FLAG: --enable-load-reader="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973621 5127 flags.go:64] FLAG: --enable-server="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973628 5127 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973643 5127 flags.go:64] FLAG: --event-burst="100" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973651 5127 flags.go:64] FLAG: --event-qps="50" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973659 5127 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973667 5127 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973675 5127 flags.go:64] FLAG: --eviction-hard="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973685 5127 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973693 5127 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973700 5127 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973708 5127 flags.go:64] FLAG: --eviction-soft="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973716 5127 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973723 5127 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973731 5127 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973739 5127 flags.go:64] FLAG: --experimental-mounter-path="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973746 5127 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973754 5127 flags.go:64] FLAG: --fail-swap-on="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973761 5127 flags.go:64] FLAG: --feature-gates="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973771 5127 flags.go:64] FLAG: --file-check-frequency="20s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973779 5127 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973787 5127 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973794 5127 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 
06:47:29.973800 5127 flags.go:64] FLAG: --healthz-port="10248" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973808 5127 flags.go:64] FLAG: --help="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973815 5127 flags.go:64] FLAG: --hostname-override="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973823 5127 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973831 5127 flags.go:64] FLAG: --http-check-frequency="20s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973839 5127 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973847 5127 flags.go:64] FLAG: --image-credential-provider-config="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973856 5127 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973864 5127 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973873 5127 flags.go:64] FLAG: --image-service-endpoint="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973881 5127 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973888 5127 flags.go:64] FLAG: --kube-api-burst="100" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973896 5127 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973905 5127 flags.go:64] FLAG: --kube-api-qps="50" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973912 5127 flags.go:64] FLAG: --kube-reserved="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973919 5127 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973927 5127 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973935 5127 flags.go:64] FLAG: --kubelet-cgroups="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973943 5127 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973951 5127 flags.go:64] FLAG: --lock-file="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973959 5127 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973967 5127 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973974 5127 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973987 5127 flags.go:64] FLAG: --log-json-split-stream="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.973994 5127 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974001 5127 flags.go:64] FLAG: --log-text-split-stream="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974008 5127 flags.go:64] FLAG: --logging-format="text" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974016 5127 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974024 5127 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974031 5127 flags.go:64] FLAG: --manifest-url="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974039 5127 
flags.go:64] FLAG: --manifest-url-header="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974049 5127 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974057 5127 flags.go:64] FLAG: --max-open-files="1000000" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974066 5127 flags.go:64] FLAG: --max-pods="110" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974074 5127 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974083 5127 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974091 5127 flags.go:64] FLAG: --memory-manager-policy="None" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974098 5127 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974106 5127 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974113 5127 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974121 5127 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974140 5127 flags.go:64] FLAG: --node-status-max-images="50" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974148 5127 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974155 5127 flags.go:64] FLAG: --oom-score-adj="-999" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974162 5127 flags.go:64] FLAG: --pod-cidr="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974171 5127 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974183 5127 flags.go:64] FLAG: --pod-manifest-path="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974190 5127 flags.go:64] FLAG: --pod-max-pids="-1" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974198 5127 flags.go:64] FLAG: --pods-per-core="0" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974205 5127 flags.go:64] FLAG: --port="10250" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974212 5127 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974218 5127 flags.go:64] FLAG: --provider-id="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974223 5127 flags.go:64] FLAG: --qos-reserved="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974229 5127 flags.go:64] FLAG: --read-only-port="10255" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974235 5127 flags.go:64] FLAG: --register-node="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974241 5127 flags.go:64] FLAG: --register-schedulable="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974247 5127 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974262 5127 flags.go:64] FLAG: --registry-burst="10" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974267 5127 flags.go:64] FLAG: --registry-qps="5" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974273 5127 flags.go:64] 
FLAG: --reserved-cpus="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974279 5127 flags.go:64] FLAG: --reserved-memory="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974286 5127 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974292 5127 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974298 5127 flags.go:64] FLAG: --rotate-certificates="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974304 5127 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974310 5127 flags.go:64] FLAG: --runonce="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974316 5127 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974322 5127 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974329 5127 flags.go:64] FLAG: --seccomp-default="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974335 5127 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974341 5127 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974348 5127 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974355 5127 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974362 5127 flags.go:64] FLAG: --storage-driver-password="root" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974369 5127 flags.go:64] FLAG: --storage-driver-secure="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974376 5127 flags.go:64] FLAG: --storage-driver-table="stats" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974383 5127 flags.go:64] FLAG: --storage-driver-user="root" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974389 5127 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974397 5127 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974404 5127 flags.go:64] FLAG: --system-cgroups="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974411 5127 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974424 5127 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974431 5127 flags.go:64] FLAG: --tls-cert-file="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974438 5127 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974447 5127 flags.go:64] FLAG: --tls-min-version="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974455 5127 flags.go:64] FLAG: --tls-private-key-file="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974462 5127 flags.go:64] FLAG: --topology-manager-policy="none" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974469 5127 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974477 5127 flags.go:64] FLAG: --topology-manager-scope="container" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974484 5127 flags.go:64] 
FLAG: --v="2" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974494 5127 flags.go:64] FLAG: --version="false" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974503 5127 flags.go:64] FLAG: --vmodule="" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974511 5127 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.974519 5127 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974762 5127 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974777 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974785 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974792 5127 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974798 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974806 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974813 5127 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974821 5127 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974827 5127 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974834 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974842 5127 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974849 5127 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974855 5127 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974861 5127 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974868 5127 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974874 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974880 5127 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974886 5127 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974895 5127 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974903 5127 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974910 5127 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974917 5127 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974924 5127 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974934 5127 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974941 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974947 5127 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974954 5127 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974961 5127 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974968 5127 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974976 5127 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974983 5127 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974989 5127 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.974996 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975003 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975010 5127 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975017 5127 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975023 5127 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975029 5127 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975036 5127 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975043 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975050 5127 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975057 5127 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975064 5127 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975074 5127 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975082 5127 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975089 5127 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975097 5127 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975103 5127 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975110 5127 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975118 5127 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975125 5127 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975131 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975137 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975145 5127 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975152 5127 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975158 5127 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975164 5127 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975171 5127 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975177 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975185 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975191 5127 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975196 5127 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975201 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975206 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975211 5127 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975216 5127 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975221 5127 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975226 5127 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975231 5127 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975236 5127 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.975241 5127 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.975250 5127 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.987864 5127 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.987915 5127 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988058 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988083 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988092 5127 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988100 5127 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988110 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988118 5127 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988127 5127 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988134 5127 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988144 5127 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988152 5127 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988160 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988167 5127 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988175 5127 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988183 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988192 5127 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988200 5127 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988208 5127 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988216 5127 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988224 5127 
feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988232 5127 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988240 5127 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988248 5127 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988256 5127 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988264 5127 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988271 5127 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988279 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988287 5127 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988295 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988303 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988311 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988320 5127 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988328 5127 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988340 5127 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988353 5127 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988362 5127 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988371 5127 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988380 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988388 5127 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988398 5127 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988405 5127 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988414 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988422 5127 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988431 5127 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988439 5127 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988448 5127 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988456 5127 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988464 5127 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988473 5127 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988482 5127 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988491 5127 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988501 5127 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988510 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988519 5127 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988529 5127 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988538 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988547 5127 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988556 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988565 5127 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988577 5127 feature_gate.go:351] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988616 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988625 5127 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988637 5127 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988649 5127 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988660 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988670 5127 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988679 5127 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988689 5127 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988700 5127 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988709 5127 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988720 5127 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988731 5127 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.988745 5127 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988958 5127 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988971 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988981 5127 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988989 5127 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.988997 5127 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989005 5127 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989013 5127 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989021 5127 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989029 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989037 5127 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989045 5127 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989053 5127 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989061 5127 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989068 5127 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989077 5127 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989087 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989095 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989103 5127 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989114 5127 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989123 5127 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989132 5127 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989139 5127 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989148 5127 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989155 5127 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989163 5127 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989171 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989178 5127 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989186 5127 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989194 5127 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989201 5127 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989209 5127 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989217 5127 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989224 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989234 5127 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989241 5127 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989249 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989256 5127 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989264 5127 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989272 5127 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989280 5127 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989287 5127 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989295 5127 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989302 5127 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989313 5127 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989323 5127 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989331 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989341 5127 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989351 5127 feature_gate.go:330] unrecognized feature gate: Example Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989362 5127 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989371 5127 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989380 5127 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989388 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989395 5127 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989404 5127 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989412 5127 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989419 5127 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989427 5127 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989435 5127 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989443 5127 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989451 5127 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989459 5127 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989467 5127 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989475 5127 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989483 5127 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989494 5127 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989504 5127 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989512 5127 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989520 5127 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989528 5127 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989537 5127 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 01 06:47:29 crc kubenswrapper[5127]: W0201 06:47:29.989546 5127 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.989559 5127 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.990808 5127 server.go:940] "Client rotation is on, will bootstrap in background" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.997422 5127 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 01 06:47:29 crc kubenswrapper[5127]: I0201 06:47:29.997567 5127 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
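The feature gates: {map[...]} line is logged three times during startup (06:47:29.975250, .988745, .989559) with an identical map each time, so the effective gate set is stable across the repeated parses. To pull that set programmatically out of a journal dump, a minimal sketch that parses the Go map[string]bool rendering shown above; this is illustrative only, not part of any kubelet tooling:

  import re

  def parse_feature_gates(line: str) -> dict[str, bool]:
      # Matches the Go fmt rendering: feature gates: {map[Name:true Other:false]}
      m = re.search(r"feature gates: \{map\[(.*)\]\}", line)
      if not m:
          return {}
      # Entries are space-separated Name:bool pairs
      return {name: value == "true"
              for name, value in (pair.split(":", 1) for pair in m.group(1).split())}

  # Example against a shortened copy of the line logged above
  line = ("I0201 06:47:29.989559 5127 feature_gate.go:386] feature gates: "
          "{map[CloudDualStackNodeIPs:true KMSv1:true VolumeAttributesClass:false]}")
  print(parse_feature_gates(line))
  # -> {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'VolumeAttributesClass': False}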
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.000196 5127 server.go:997] "Starting client certificate rotation"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.000249 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.002561 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 19:39:43.831966581 +0000 UTC
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.002723 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.027045 5127 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.030490 5127 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.030967 5127 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.046845 5127 log.go:25] "Validated CRI v1 runtime API"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.087358 5127 log.go:25] "Validated CRI v1 image API"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.089880 5127 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.097335 5127 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-01-06-38-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.097382 5127 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.124036 5127 manager.go:217] Machine: {Timestamp:2026-02-01 06:47:30.121456904 +0000 UTC m=+0.607359347 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ebe07c8f-9946-4616-a1da-f5bf2315344d BootID:98b48bbb-d7fc-478c-b553-b66324236dfc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1c:66:9b Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1c:66:9b Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:39:49:6a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:48:9a:de Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ae:86:da Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:49:d8:53 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:38:d7:54 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:dc:e3:c0 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:12:40:e6:de:0e:96 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5e:a4:ba:98:38:2b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.124448 5127 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.124793 5127 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.129001 5127 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.129362 5127 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.129416 5127 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.130433 5127 topology_manager.go:138] "Creating topology manager with none policy"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.130466 5127 container_manager_linux.go:303] "Creating device plugin manager"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.131556 5127 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.131629 5127 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.131942 5127 state_mem.go:36] "Initialized new in-memory state store"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.132079 5127 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.136049 5127 kubelet.go:418] "Attempting to sync node with API server"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.136086 5127 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.136116 5127 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.136138 5127 kubelet.go:324] "Adding apiserver pod source"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.136161 5127 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.147286 5127 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.148187 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.149074 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.148720 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.149132 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.151323 5127 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
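Note: the nodeConfig dump above (container_manager_linux.go:272) is plain JSON, so values like the hard-eviction thresholds can be pulled out mechanically when triaging a node. A minimal Go sketch under that assumption; the struct models only the fields it reads, and the literal is abbreviated from the logged record:

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Only the fields this sketch inspects; the real NodeConfig carries many more.
    type threshold struct {
        Signal   string
        Operator string
        Value    struct {
            Quantity   *string // e.g. "100Mi"; null in the log when a percentage is used
            Percentage float64
        }
    }

    type nodeConfig struct {
        NodeName               string
        HardEvictionThresholds []threshold
    }

    // Abbreviated from the container_manager_linux.go:272 record above.
    const logged = `{"NodeName":"crc","HardEvictionThresholds":[
      {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
      {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

    func main() {
        var cfg nodeConfig
        if err := json.Unmarshal([]byte(logged), &cfg); err != nil {
            panic(err)
        }
        for _, t := range cfg.HardEvictionThresholds {
            if t.Value.Quantity != nil {
                fmt.Printf("%s: %s %s %s\n", cfg.NodeName, t.Signal, t.Operator, *t.Value.Quantity)
            } else {
                fmt.Printf("%s: %s %s %.0f%%\n", cfg.NodeName, t.Signal, t.Operator, t.Value.Percentage*100)
            }
        }
    }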
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.153213 5127 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155178 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155235 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155258 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155279 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155308 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155327 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155346 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155369 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155386 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155401 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155419 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.155434 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.157865 5127 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.158575 5127 server.go:1280] "Started kubelet"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.158936 5127 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.158925 5127 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.159970 5127 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 01 06:47:30 crc systemd[1]: Started Kubernetes Kubelet.
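Note: every error so far is the same symptom: TCP to api-int.crc.testing:6443 (38.102.83.22) is refused, which is expected this early, since on a single-node cluster the kubelet must come up before the kube-apiserver static pod it launches. A minimal Go probe equivalent to what the failing dials are doing; the address is copied from the log:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Endpoint taken from the dial errors in the log above.
        addr := "api-int.crc.testing:6443"
        conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
        if err != nil {
            // During early startup this mirrors the logged
            // "connect: connection refused" from the reflectors.
            fmt.Println("apiserver not reachable:", err)
            return
        }
        conn.Close()
        fmt.Println("apiserver reachable")
    }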
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.161469 5127 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.162265 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.162319 5127 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.162543 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:51:30.892900153 +0000 UTC
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.162678 5127 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.163129 5127 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.163169 5127 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.163342 5127 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.164212 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.164355 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError"
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.164169 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.165907 5127 server.go:460] "Adding debug handlers to kubelet server"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.169078 5127 factory.go:55] Registering systemd factory
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.169131 5127 factory.go:221] Registration of the systemd container factory successfully
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.170134 5127 factory.go:153] Registering CRI-O factory
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.170185 5127 factory.go:221] Registration of the crio container factory successfully
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.170304 5127 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.170351 5127 factory.go:103] Registering Raw factory
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.170386 5127 manager.go:1196] Started watching for new ooms in manager
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.171503 5127 manager.go:319] Starting recovery of all containers
Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.167709 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18900c8c458e7b73 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:47:30.158525299 +0000 UTC m=+0.644427692,LastTimestamp:2026-02-01 06:47:30.158525299 +0000 UTC m=+0.644427692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190033 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190169 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190196 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190218 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190240 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190262 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190283 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190306 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190332 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190360 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190387 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190416 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190443 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190475 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190512 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.190538 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194046 5127 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194104 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194132 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194155 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194175 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194197 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194216 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194234 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194253 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194273 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194293 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194320 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194348 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194368 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194388 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194409 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194477 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194498 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194524 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194542 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194559 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194658 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194682 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194702 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194723 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194743 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194764 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194802 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194822 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194843 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194863 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194884 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194941 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194962 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.194982 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195001 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195021 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195047 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195069 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195091 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195113 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195133 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195152 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195172 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195195 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195213 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195233 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195253 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195272 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195295 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195315 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195335 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195354 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195374 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195396 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195418 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195438 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195457 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195480 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195499 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195520 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195543 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195564 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195641 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195694 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195719 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195740 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195761 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195781 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195801 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.195821 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196024 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196043 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196064 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196086 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196106 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196126 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196147 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196168 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196188 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196212 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196234 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196256 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196275 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196295 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196316 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196337 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196356 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196375 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196402 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196425 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196445 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196465 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196487 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196508 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196527 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196548 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196570 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196629 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196657 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196678 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196699 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196720 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196739 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196760 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196781 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196800 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196819 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196841 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196860 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196879 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196899 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196926 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196948 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196970 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.196990 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197009 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197067 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197087 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197107 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197127 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197149 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197169 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197188 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08"
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197210 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197232 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197252 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197272 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197292 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197313 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197332 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197390 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197418 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197441 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197470 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197494 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197519 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197542 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197564 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197620 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197650 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197673 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197700 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197724 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197753 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197777 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197800 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197825 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197850 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197872 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197895 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197918 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197941 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197965 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.197989 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198011 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198035 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198058 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198083 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198110 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198135 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198161 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198191 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198218 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198245 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198273 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198299 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198367 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198393 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198419 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198481 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198509 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198537 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198570 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198662 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198694 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198723 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198753 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198777 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198802 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198829 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198865 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198892 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198915 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198938 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198961 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.198982 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199007 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199029 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199051 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199073 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199095 5127 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199119 5127 reconstruct.go:97] "Volume reconstruction finished" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.199135 5127 reconciler.go:26] "Reconciler: start to sync state" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.211281 5127 manager.go:324] Recovery completed Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.223759 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.225568 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.225681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.225705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.226813 5127 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.226975 5127 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.227123 5127 state_mem.go:36] "Initialized new in-memory state store" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.231179 5127 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.234159 5127 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.234226 5127 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.234268 5127 kubelet.go:2335] "Starting kubelet main sync loop" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.234359 5127 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.235973 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.236041 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.239562 5127 policy_none.go:49] "None policy: Start" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.241632 5127 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.241694 5127 state_mem.go:35] "Initializing new in-memory state store" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.263492 5127 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.302016 5127 manager.go:334] "Starting Device Plugin manager" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.302133 5127 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.302154 5127 server.go:79] "Starting device plugin registration server" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.302778 5127 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.302802 5127 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.303085 5127 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.303196 5127 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.303207 5127 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.317780 5127 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.335103 5127 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 01 06:47:30 crc kubenswrapper[5127]: 
Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.335327 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.337527 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.337626 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.337653 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.337976 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.338439 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.338527 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.339360 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.339411 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.339432 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.339937 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.340145 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.340215 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.340232 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.341033 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.341115 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.341716 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.341775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.341798 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342024 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342176 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
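
The repetitive block above is the first sync pass over those five static pods: for each pod the kubelet re-derives node conditions (hence the recurring NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID triplets) and, since this is a fresh start with no live containers, CRI reports no existing sandbox, producing "No sandbox for pod can be found. Need to start a new one" per pod. A quick tally makes the repetition easier to audit. This is a hypothetical one-off helper whose patterns match only the message shapes visible in this capture, not a general journald parser:

package main

import (
	"fmt"
	"io"
	"os"
	"regexp"
)

func main() {
	data, err := io.ReadAll(os.Stdin) // feed the raw journal text on stdin
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// Message shapes copied from the records above.
	sandbox := regexp.MustCompile(`No sandbox for pod can be found. Need to start a new one" pod="([^"]+)"`)
	events := regexp.MustCompile(`event="(NodeHas[A-Za-z]+)"`)

	pods := map[string]int{}
	for _, m := range sandbox.FindAllStringSubmatch(string(data), -1) {
		pods[m[1]]++
	}
	evs := map[string]int{}
	for _, m := range events.FindAllStringSubmatch(string(data), -1) {
		evs[m[1]]++
	}
	fmt.Println("pods needing new sandboxes:", pods)
	fmt.Println("node status events:", evs)
}

Fed this section on stdin, it should report one count per openshift-* static pod per sandbox record and roughly equal counts for the three NodeHas* conditions.
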
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342233 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342318 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.342348 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.343491 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.343519 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.343529 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.344778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.344835 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.344856 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.347259 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.347369 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.347450 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.349874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.349922 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.349934 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.350127 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.350157 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.350514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.350552 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.350570 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.352339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.352379 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.352405 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.364904 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402223 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402281 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402316 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402344 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402375 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 
06:47:30.402403 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402428 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402456 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402479 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402509 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402537 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402563 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402668 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402721 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402848 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.402919 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.403933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.403983 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.404002 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.404038 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.404526 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504156 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504880 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504715 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.504937 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505098 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505276 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505308 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505373 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505402 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505430 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505444 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505484 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") 
pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505523 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505535 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505551 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505555 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505623 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505626 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505651 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505658 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505673 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505694 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505714 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505738 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505790 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.505851 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.513796 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18900c8c458e7b73 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:47:30.158525299 +0000 UTC m=+0.644427692,LastTimestamp:2026-02-01 06:47:30.158525299 +0000 UTC m=+0.644427692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.605323 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.607009 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.607076 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.607093 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.607134 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.608241 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.22:6443: connect: connection refused" node="crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.665129 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.697059 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.717544 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fcc20ddaf5918961e0109f2055148386904caea114a12df25470716914687c65 WatchSource:0}: Error finding container fcc20ddaf5918961e0109f2055148386904caea114a12df25470716914687c65: Status 404 returned error can't find the container with id fcc20ddaf5918961e0109f2055148386904caea114a12df25470716914687c65 Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.726950 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.735118 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4d8c2ed34864b96056ce402b7c08473ba37480b396e82594192addc79e1a7b48 WatchSource:0}: Error finding container 4d8c2ed34864b96056ce402b7c08473ba37480b396e82594192addc79e1a7b48: Status 404 returned error can't find the container with id 4d8c2ed34864b96056ce402b7c08473ba37480b396e82594192addc79e1a7b48 Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.749391 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1d392406840d3360b6a86adc00bc282e354b0efabb557e879b2bbc6d5c01daa7 WatchSource:0}: Error finding container 1d392406840d3360b6a86adc00bc282e354b0efabb557e879b2bbc6d5c01daa7: Status 404 returned error can't find the container with id 1d392406840d3360b6a86adc00bc282e354b0efabb557e879b2bbc6d5c01daa7 Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.751601 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: I0201 06:47:30.763796 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:30 crc kubenswrapper[5127]: E0201 06:47:30.766145 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.770709 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-25f33b419a9c1f7130e85ff7ac92be2186c52d58f059c5f0e856d8dc899d519f WatchSource:0}: Error finding container 25f33b419a9c1f7130e85ff7ac92be2186c52d58f059c5f0e856d8dc899d519f: Status 404 returned error can't find the container with id 25f33b419a9c1f7130e85ff7ac92be2186c52d58f059c5f0e856d8dc899d519f Feb 01 06:47:30 crc kubenswrapper[5127]: W0201 06:47:30.783340 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-6621b978a21e5bd770863493edbf1b1bf02fbf7f3ea814944a7bbef92fa6e91b WatchSource:0}: Error finding container 6621b978a21e5bd770863493edbf1b1bf02fbf7f3ea814944a7bbef92fa6e91b: Status 404 returned error can't find the container with id 6621b978a21e5bd770863493edbf1b1bf02fbf7f3ea814944a7bbef92fa6e91b Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.009334 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.011091 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.011122 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.011158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.011186 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.011513 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.162819 5127 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.162909 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 22:38:30.802744186 +0000 UTC Feb 01 06:47:31 crc kubenswrapper[5127]: W0201 06:47:31.185596 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.185697 5127 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.239265 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"25f33b419a9c1f7130e85ff7ac92be2186c52d58f059c5f0e856d8dc899d519f"} Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.240517 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d392406840d3360b6a86adc00bc282e354b0efabb557e879b2bbc6d5c01daa7"} Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.243508 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d8c2ed34864b96056ce402b7c08473ba37480b396e82594192addc79e1a7b48"} Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.244842 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fcc20ddaf5918961e0109f2055148386904caea114a12df25470716914687c65"} Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.246558 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6621b978a21e5bd770863493edbf1b1bf02fbf7f3ea814944a7bbef92fa6e91b"} Feb 01 06:47:31 crc kubenswrapper[5127]: W0201 06:47:31.402423 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.402540 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:31 crc kubenswrapper[5127]: W0201 06:47:31.430970 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.431047 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:31 crc kubenswrapper[5127]: W0201 06:47:31.490314 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.490399 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.567140 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.812172 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.814336 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.814397 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.814415 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:31 crc kubenswrapper[5127]: I0201 06:47:31.814453 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:31 crc kubenswrapper[5127]: E0201 06:47:31.815190 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.069984 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 01 06:47:32 crc kubenswrapper[5127]: E0201 06:47:32.071767 5127 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.162725 5127 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.163678 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 11:32:55.40868401 +0000 UTC Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.252817 5127 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98" exitCode=0 Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.252938 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.252961 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.254785 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.254853 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.254880 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.257501 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.257539 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.257549 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.257577 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.257641 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260002 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260042 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260058 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260117 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260016 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8" exitCode=0 Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.260254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.261274 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.261316 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.261338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.264950 5127 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21" exitCode=0 Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.265192 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.266481 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.268366 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.269348 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.269436 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.269456 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.272940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.272992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.273012 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.273877 5127 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179" exitCode=0 Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.273971 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179"} Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.274312 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.282116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.282192 5127 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:32 crc kubenswrapper[5127]: I0201 06:47:32.282210 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: W0201 06:47:33.141666 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:33 crc kubenswrapper[5127]: E0201 06:47:33.142215 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.163143 5127 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.164134 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:49:19.700540853 +0000 UTC Feb 01 06:47:33 crc kubenswrapper[5127]: E0201 06:47:33.167981 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.277669 5127 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7" exitCode=0 Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.277764 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.277811 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.278642 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.278675 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.278686 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.284260 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.284500 5127 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.286037 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.286070 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.286082 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.297215 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.297289 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.297304 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.297333 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.298286 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.298339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.298350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.301194 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.301332 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.301368 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.301250 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.301396 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b"} Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.302217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.302259 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.302275 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.415774 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.417198 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.417246 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.417256 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:33 crc kubenswrapper[5127]: I0201 06:47:33.417287 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:33 crc kubenswrapper[5127]: E0201 06:47:33.417807 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Feb 01 06:47:33 crc kubenswrapper[5127]: W0201 06:47:33.561415 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Feb 01 06:47:33 crc kubenswrapper[5127]: E0201 06:47:33.561497 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.164328 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 15:05:54.622163153 +0000 UTC Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.309042 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818"} Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.309174 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.310640 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.310692 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 
06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.310710 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312071 5127 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f" exitCode=0 Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312156 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312175 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f"} Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312198 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312242 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.312319 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313730 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313694 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.313918 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.314466 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.314523 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.314542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:34 crc kubenswrapper[5127]: I0201 06:47:34.530619 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.165382 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 05:06:30.187430541 +0000 UTC Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.319164 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8908ea21cbb09bc49a45bb88c61d337966b1006166e92da4ba4cc50ecfe47568"} Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.319227 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.319239 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b6e9c0789ea3fba09e6acc8e9548395f6ad0333e9ac67f893c6338090943511"} Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.319262 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"740ca69e35708127ed32e647e8cd9a203b0696d3508b0fb8876e544983ced563"} Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.319292 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.320333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.320394 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.320416 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.567441 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.567705 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.569177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.569232 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.569249 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:35 crc kubenswrapper[5127]: I0201 06:47:35.924439 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.098498 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.166463 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 03:55:21.457180582 +0000 UTC Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.328632 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.328706 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.329130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"226c1b4a1565d1d3068623995df09dad0197b18855a84fb07f3fa49b393441db"} Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.329191 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26b1ad49dcaaabd979a0f55208958b957478c7b93ef17cfd770fb6166c65e3e3"} Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.329300 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.330179 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.330218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.330234 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.331012 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.331042 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.331059 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.618750 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.620732 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.620791 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.620808 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.620843 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.675682 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.675951 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.677534 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.677631 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:36 crc kubenswrapper[5127]: I0201 06:47:36.677650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.085344 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 01 06:47:37 crc kubenswrapper[5127]: 
I0201 06:47:37.166757 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:18:47.297219641 +0000 UTC Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.331209 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.331248 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.331251 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.332424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.332477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.332489 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.333057 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.333083 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.333092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.539425 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.539612 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.540819 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.540889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.540908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:37 crc kubenswrapper[5127]: I0201 06:47:37.549567 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.167928 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 22:32:26.393324412 +0000 UTC Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.334461 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.334512 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.334570 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.335968 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.336018 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.336015 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.336064 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.336037 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.336083 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:38 crc kubenswrapper[5127]: I0201 06:47:38.338822 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.168790 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 14:53:20.956136355 +0000 UTC Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.337524 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.337546 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339060 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339101 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339189 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339229 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.339248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.642937 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.643184 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.644662 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.644747 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.644766 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.675770 5127 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:47:39 crc kubenswrapper[5127]: I0201 06:47:39.675881 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 06:47:40 crc kubenswrapper[5127]: I0201 06:47:40.170013 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:59:13.564967828 +0000 UTC Feb 01 06:47:40 crc kubenswrapper[5127]: E0201 06:47:40.318917 5127 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.171145 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 12:27:19.956067559 +0000 UTC Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.396501 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.396771 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.398873 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.398948 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.398967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:41 crc kubenswrapper[5127]: I0201 06:47:41.593446 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:42 crc kubenswrapper[5127]: I0201 06:47:42.172301 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 06:41:29.083449463 +0000 UTC Feb 01 06:47:42 crc kubenswrapper[5127]: I0201 06:47:42.347406 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:42 crc kubenswrapper[5127]: I0201 06:47:42.349090 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:42 crc kubenswrapper[5127]: I0201 06:47:42.349133 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:42 
crc kubenswrapper[5127]: I0201 06:47:42.349143 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:43 crc kubenswrapper[5127]: I0201 06:47:43.172946 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 23:23:59.877192383 +0000 UTC Feb 01 06:47:43 crc kubenswrapper[5127]: W0201 06:47:43.990801 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 01 06:47:43 crc kubenswrapper[5127]: I0201 06:47:43.990957 5127 trace.go:236] Trace[1273294409]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:47:33.989) (total time: 10001ms): Feb 01 06:47:43 crc kubenswrapper[5127]: Trace[1273294409]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:47:43.990) Feb 01 06:47:43 crc kubenswrapper[5127]: Trace[1273294409]: [10.001514499s] [10.001514499s] END Feb 01 06:47:43 crc kubenswrapper[5127]: E0201 06:47:43.990999 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.164475 5127 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.173852 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 14:23:26.587724006 +0000 UTC Feb 01 06:47:44 crc kubenswrapper[5127]: W0201 06:47:44.366550 5127 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.366671 5127 trace.go:236] Trace[1330314049]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:47:34.365) (total time: 10001ms): Feb 01 06:47:44 crc kubenswrapper[5127]: Trace[1330314049]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:47:44.366) Feb 01 06:47:44 crc kubenswrapper[5127]: Trace[1330314049]: [10.001212471s] [10.001212471s] END Feb 01 06:47:44 crc kubenswrapper[5127]: E0201 06:47:44.366700 5127 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.751298 5127 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.751841 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.770275 5127 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 01 06:47:44 crc kubenswrapper[5127]: I0201 06:47:44.770370 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 01 06:47:45 crc kubenswrapper[5127]: I0201 06:47:45.175028 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:27:11.610590979 +0000 UTC Feb 01 06:47:45 crc kubenswrapper[5127]: I0201 06:47:45.929832 5127 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]log ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]etcd ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/generic-apiserver-start-informers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/priority-and-fairness-filter ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-apiextensions-informers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-apiextensions-controllers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/crd-informer-synced ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-system-namespaces-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 01 06:47:45 crc 
kubenswrapper[5127]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/bootstrap-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/start-kube-aggregator-informers ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-registration-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-discovery-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]autoregister-completion ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-openapi-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 01 06:47:45 crc kubenswrapper[5127]: livez check failed Feb 01 06:47:45 crc kubenswrapper[5127]: I0201 06:47:45.929927 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:47:46 crc kubenswrapper[5127]: I0201 06:47:46.175340 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 00:38:44.559403344 +0000 UTC Feb 01 06:47:47 crc kubenswrapper[5127]: I0201 06:47:47.176120 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 04:32:53.895473959 +0000 UTC Feb 01 06:47:47 crc kubenswrapper[5127]: I0201 06:47:47.239725 5127 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 01 06:47:47 crc kubenswrapper[5127]: I0201 06:47:47.627518 5127 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.177036 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:16:56.940016854 +0000 UTC Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.378328 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.378566 5127 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.380222 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.380279 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.380296 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:48 crc kubenswrapper[5127]: I0201 06:47:48.391780 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.177816 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 13:04:47.437476334 +0000 UTC Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.364206 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.365729 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.365769 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.365780 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.677310 5127 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.677454 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 01 06:47:49 crc kubenswrapper[5127]: E0201 06:47:49.766031 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.768838 5127 trace.go:236] Trace[1509636706]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:47:36.594) (total time: 13173ms): Feb 01 06:47:49 crc kubenswrapper[5127]: Trace[1509636706]: ---"Objects listed" error: 13173ms (06:47:49.768) Feb 01 06:47:49 crc kubenswrapper[5127]: Trace[1509636706]: [13.17394008s] [13.17394008s] END Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.768883 5127 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.770606 5127 trace.go:236] Trace[994127243]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 06:47:37.992) (total time: 11777ms): Feb 01 06:47:49 crc kubenswrapper[5127]: Trace[994127243]: ---"Objects listed" error: 11777ms (06:47:49.770) Feb 01 06:47:49 crc kubenswrapper[5127]: Trace[994127243]: [11.777781218s] [11.777781218s] END Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.770628 5127 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.771327 5127 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.772729 5127 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 01 06:47:49 crc kubenswrapper[5127]: E0201 06:47:49.772806 5127 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.794079 5127 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35378->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.794160 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35378->192.168.126.11:17697: read: connection reset by peer" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.794381 5127 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35386->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.794430 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35386->192.168.126.11:17697: read: connection reset by peer" Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.895019 5127 csr.go:261] certificate signing request csr-j72d5 is approved, waiting to be issued Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.908683 5127 csr.go:257] certificate signing request csr-j72d5 is issued Feb 01 06:47:49 crc kubenswrapper[5127]: I0201 06:47:49.999476 5127 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 01 06:47:49 crc kubenswrapper[5127]: W0201 06:47:49.999818 5127 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:47:49 crc kubenswrapper[5127]: W0201 06:47:49.999831 5127 reflector.go:484] k8s.io/client-go/informers/factory.go:160: 
watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 01 06:47:49 crc kubenswrapper[5127]: E0201 06:47:49.999785 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": read tcp 38.102.83.22:52004->38.102.83.22:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-apiserver-crc.18900c8c8b8c4018 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:47:31.332784152 +0000 UTC m=+1.818686545,LastTimestamp:2026-02-01 06:47:31.332784152 +0000 UTC m=+1.818686545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.148628 5127 apiserver.go:52] "Watching apiserver" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.150250 5127 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.150534 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.150936 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.151026 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.151056 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.151063 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.151482 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.151610 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.152685 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.154059 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.154137 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.157195 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.157544 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.159625 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.159700 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.159716 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.159784 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.159784 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.160111 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.161899 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.163908 5127 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.172530 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174394 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174510 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174615 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174683 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174751 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174816 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174878 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.174946 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175040 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175105 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175174 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175239 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175369 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175433 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175659 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175757 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175859 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.175939 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176010 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176070 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176145 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 
06:47:50.176206 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176218 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176278 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176299 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176315 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176351 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176368 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176382 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176410 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176407 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod 
"0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176429 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176495 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176555 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176767 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176805 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176859 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176897 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176924 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176947 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176969 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.176992 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177016 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177034 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177038 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177073 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177091 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177165 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177200 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177326 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177398 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177445 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177495 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177543 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177571 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177616 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177640 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177662 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177678 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177663 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177698 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177718 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177734 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177752 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177773 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177789 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177804 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177819 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177837 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod 
"fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177900 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177866 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177919 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177935 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177955 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177968 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.177997 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178021 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178043 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178066 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178087 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178116 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178110 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178155 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178275 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178357 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178670 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178681 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.178738 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179168 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:04:57.56764974 +0000 UTC Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179362 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179410 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179532 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179618 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179647 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179803 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.179990 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180053 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180156 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180251 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180315 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180398 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180744 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180878 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.180988 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181173 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181231 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181267 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181264 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181285 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181291 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181303 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181322 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181341 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181422 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181558 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181601 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181625 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181643 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181660 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181676 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181693 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181709 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181727 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181742 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181758 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181773 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181789 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181810 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181825 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181841 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181857 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181877 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181893 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181910 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181924 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.181940 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.182138 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.182273 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.182627 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.182784 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183106 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183251 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183261 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183469 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183640 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183886 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185003 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183951 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185386 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189091 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189129 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189147 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189167 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189184 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189205 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189221 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189238 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189255 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189272 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189288 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189305 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189323 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189340 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189356 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189371 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189386 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189402 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189443 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189460 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189481 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189531 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189566 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189598 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189617 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189633 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189651 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189668 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189866 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189883 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189900 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189916 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189934 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189951 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189966 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190011 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190026 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190043 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190058 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190075 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190092 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190110 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190127 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190143 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191314 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191383 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191402 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191420 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191437 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191454 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191534 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191554 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191604 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191638 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191698 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191717 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191855 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191880 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192220 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192296 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192683 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192708 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192825 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192853 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192871 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.192940 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193189 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193234 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193253 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193279 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193778 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193829 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193849 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193868 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.193907 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194014 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194058 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194086 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194106 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194142 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194407 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194435 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194476 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194499 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194524 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194544 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194682 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194746 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194772 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194825 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194853 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.194870 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195010 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195213 5127 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195231 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195242 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195252 5127 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195261 5127 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195270 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195279 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195289 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195299 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195308 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195318 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195327 5127 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195337 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" 
Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195347 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195356 5127 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195366 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195375 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195386 5127 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195396 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195405 5127 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195414 5127 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195424 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195435 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195451 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195477 5127 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195487 5127 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node 
\"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195497 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195507 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195517 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195525 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195535 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195543 5127 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195553 5127 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195562 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195572 5127 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195612 5127 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195622 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195631 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195640 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node 
\"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195650 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195661 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195672 5127 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195681 5127 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195690 5127 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183545 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183609 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183908 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200188 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183961 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.184046 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.183559 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.184279 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.184521 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.184645 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.184976 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185103 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185572 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185658 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185913 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185883 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.185943 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.186060 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.186307 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.186524 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187180 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187291 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187324 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187426 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187459 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187574 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.187803 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.188165 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189248 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189626 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.189613 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190257 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190398 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.190809 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191195 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.191211 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195079 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195435 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195638 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195640 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.195686 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.196377 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.196569 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198134 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198289 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198314 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198389 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198782 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.198966 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199105 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199171 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199213 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199269 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199887 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199894 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.199911 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200038 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200057 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200459 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200557 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.200691 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201044 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201082 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201086 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201335 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201391 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201463 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201536 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201562 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.201820 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202412 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202440 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202553 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202663 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202668 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.202807 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202826 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.202974 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.203148 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.203234 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.204198 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.204258 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.204741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.204816 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205029 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205100 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205573 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205857 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205877 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.206946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.207015 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.207036 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.207241 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.207487 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.207712 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.208409 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.208571 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:50.708551825 +0000 UTC m=+21.194454188 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.209013 5127 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.209811 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:50.709789019 +0000 UTC m=+21.195691612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.209988 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.210124 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:47:50.710112457 +0000 UTC m=+21.196014820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.210265 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.210687 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.210685 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.210919 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.210956 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.205126 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.211241 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.211576 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.211773 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212352 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212421 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212495 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212492 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212546 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212909 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.212910 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.213224 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.213298 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.213310 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.213481 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.214200 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.214748 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.216481 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.216807 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.224147 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225510 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225542 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225559 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225638 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225652 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:50.725633644 +0000 UTC m=+21.211536017 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225659 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225682 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.225728 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:50.725713297 +0000 UTC m=+21.211615660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.225729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.225781 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.226073 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.226119 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.226420 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.226461 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227142 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227448 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227474 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227619 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.227641 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228073 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228093 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228089 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228374 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228471 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228526 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228641 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228662 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.229094 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228749 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228920 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.228974 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.229729 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.229811 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.230210 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.230390 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.230786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.230895 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.231028 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.231445 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.231573 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.232008 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.232132 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.234639 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.236565 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.249555 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.258119 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.260909 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.276953 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.279342 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.281507 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.282658 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.286521 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.287387 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.290068 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.290574 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.292489 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.293067 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.295096 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.297295 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.297135 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.297923 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298574 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298325 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298778 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298935 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298955 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: 
I0201 06:47:50.298965 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298975 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298983 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.298995 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299004 5127 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299012 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299021 5127 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299033 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299041 5127 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299050 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299057 5127 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299105 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299114 5127 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299124 5127 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299136 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299146 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299154 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299163 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299175 5127 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299183 5127 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299192 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299200 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299211 5127 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299219 5127 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299227 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299239 5127 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299246 5127 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299254 5127 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299262 5127 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299273 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299281 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299290 5127 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299297 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299308 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299316 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299324 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299332 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299346 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299354 5127 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 
06:47:50.299363 5127 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299375 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299383 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299392 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299401 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299413 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299422 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299535 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299646 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299909 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299958 5127 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299976 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299985 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 01 
06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.299999 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300008 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300076 5127 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300094 5127 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300104 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300116 5127 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300168 5127 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300191 5127 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300211 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300219 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300223 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300254 5127 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300300 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300313 5127 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300325 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300334 5127 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300343 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300351 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300362 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300371 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300379 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300388 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300399 5127 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300408 5127 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300417 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300430 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300439 5127 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300448 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300457 5127 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300468 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300477 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300486 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300494 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300506 5127 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300515 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300524 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300537 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300548 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300557 5127 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300566 5127 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300782 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300792 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300802 5127 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300811 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300823 5127 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300834 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300869 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300880 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300890 5127 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300901 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300911 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300920 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 
06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300930 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300941 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300951 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300960 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300970 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300981 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300990 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.300999 5127 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301011 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301019 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301028 5127 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301037 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301048 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc 
kubenswrapper[5127]: I0201 06:47:50.301057 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301066 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301075 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301086 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301113 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301123 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301131 5127 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301142 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301151 5127 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301161 5127 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301172 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301181 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301191 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 01 
06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301201 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301213 5127 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301224 5127 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301236 5127 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301279 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301292 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301300 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301310 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301322 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301331 5127 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301340 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301349 5127 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301360 5127 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 
06:47:50.301369 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301378 5127 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301386 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301398 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301406 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301415 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301429 5127 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301495 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.301889 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.302444 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.302774 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.303369 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.304481 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.305160 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.306012 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.306641 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.308894 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.309350 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.309856 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.310977 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.311667 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.313319 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.314515 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.315376 5127 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.315476 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.317686 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.318513 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.318869 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.318915 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.320788 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.321389 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.321919 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.322931 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.323902 5127 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.324346 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.324931 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.325943 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.326861 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.327289 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.328233 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.331142 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.331927 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.333522 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.334034 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.334520 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.335743 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.336469 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.339735 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.340374 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.345002 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.356828 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.367488 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.367741 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.369483 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818" exitCode=255 Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.369526 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818"} Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.378663 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.387873 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.401240 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.401748 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.409412 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.418806 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.426390 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.471746 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.485126 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 06:47:50 crc kubenswrapper[5127]: W0201 06:47:50.494748 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-9227a67ceb68c83553b8a776537f64742ec672d4894010d186c2c936d0424a70 WatchSource:0}: Error finding container 9227a67ceb68c83553b8a776537f64742ec672d4894010d186c2c936d0424a70: Status 404 returned error can't find the container with id 9227a67ceb68c83553b8a776537f64742ec672d4894010d186c2c936d0424a70 Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.494864 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 06:47:50 crc kubenswrapper[5127]: W0201 06:47:50.509761 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-2c42e5d3ba4ad2d4d426b1979f005bceabcb1c0b20af241e7f2cee32750d7d0a WatchSource:0}: Error finding container 2c42e5d3ba4ad2d4d426b1979f005bceabcb1c0b20af241e7f2cee32750d7d0a: Status 404 returned error can't find the container with id 2c42e5d3ba4ad2d4d426b1979f005bceabcb1c0b20af241e7f2cee32750d7d0a Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.540303 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.540310 5127 scope.go:117] "RemoveContainer" containerID="b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.783931 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-s2frk"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.784307 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.784815 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xpm98"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.785077 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cmdjj"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.785196 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.785287 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.790616 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.790623 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.790840 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.790962 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.791230 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.791300 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.791478 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.792404 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.792714 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.792977 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6d7gz"] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.794608 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.795085 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.795724 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.801168 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.801197 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.801346 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.801364 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.804058 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-0
1T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.804689 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.804832 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.804922 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805000 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:47:51.804976291 +0000 UTC m=+22.290878674 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.805048 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.805102 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805192 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805235 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:51.805225068 +0000 UTC m=+22.291127431 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805303 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805335 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:51.805326381 +0000 UTC m=+22.291228744 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805436 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805505 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805560 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805689 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:51.80567659 +0000 UTC m=+22.291578953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805799 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.805859 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.806009 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: E0201 06:47:50.806082 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:51.806073701 +0000 UTC m=+22.291976064 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.820203 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.832809 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.842593 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.852343 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.868068 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.876563 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.883188 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.889444 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.901515 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905812 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krjrr\" (UniqueName: \"kubernetes.io/projected/31eb743e-decb-4243-ae21-91cc7b399ce1-kube-api-access-krjrr\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-bin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905870 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-daemon-config\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905888 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-system-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905904 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-k8s-cni-cncf-io\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905920 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905938 
5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905956 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da77835b-2181-45cd-837e-b633fd15a3c5-hosts-file\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905972 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-cnibin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.905987 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-conf-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906004 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-cnibin\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906070 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-proxy-tls\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906149 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-socket-dir-parent\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906169 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t7p\" (UniqueName: \"kubernetes.io/projected/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-kube-api-access-d7t7p\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906192 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-kubelet\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " 
pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906208 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-os-release\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906240 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906255 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-os-release\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906269 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzfm\" (UniqueName: \"kubernetes.io/projected/4d959741-37e1-43e7-9ef6-5f33433f9447-kube-api-access-klzfm\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906282 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-system-cni-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906333 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906348 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-rootfs\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906383 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-cni-binary-copy\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906424 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-hostroot\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906438 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-etc-kubernetes\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906488 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-netns\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906508 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-multus\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906528 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-multus-certs\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.906561 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jv5l\" (UniqueName: \"kubernetes.io/projected/da77835b-2181-45cd-837e-b633fd15a3c5-kube-api-access-7jv5l\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.909795 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-01 06:42:49 +0000 UTC, rotation deadline is 2026-12-08 08:40:44.484654369 +0000 UTC Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.909860 5127 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7441h52m53.574797815s for next certificate rotation Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.912326 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.921445 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.928299 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.932463 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.939536 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.953654 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-0
1T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.964570 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.976489 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.985341 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:50 crc kubenswrapper[5127]: I0201 06:47:50.995775 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.007817 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.007998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-bin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008052 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-daemon-config\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008085 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-system-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008119 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-k8s-cni-cncf-io\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008154 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da77835b-2181-45cd-837e-b633fd15a3c5-hosts-file\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-system-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008217 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-cnibin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008282 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-conf-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008295 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-cnibin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008305 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008344 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-cnibin\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008347 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-k8s-cni-cncf-io\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " 
pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008367 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-proxy-tls\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008388 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7t7p\" (UniqueName: \"kubernetes.io/projected/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-kube-api-access-d7t7p\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008419 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-socket-dir-parent\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008462 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-kubelet\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008478 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-os-release\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008512 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008540 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-system-cni-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008556 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-os-release\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008570 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klzfm\" (UniqueName: \"kubernetes.io/projected/4d959741-37e1-43e7-9ef6-5f33433f9447-kube-api-access-klzfm\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " 
pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008632 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-cni-binary-copy\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008665 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-rootfs\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008714 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-hostroot\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008729 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-etc-kubernetes\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008745 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-multus\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008764 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-multus-certs\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008820 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-netns\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008836 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7jv5l\" (UniqueName: \"kubernetes.io/projected/da77835b-2181-45cd-837e-b633fd15a3c5-kube-api-access-7jv5l\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008862 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krjrr\" (UniqueName: \"kubernetes.io/projected/31eb743e-decb-4243-ae21-91cc7b399ce1-kube-api-access-krjrr\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008951 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-daemon-config\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009247 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-os-release\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009324 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-cnibin\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009330 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-cni-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-conf-dir\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009456 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-etc-kubernetes\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.008152 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-bin\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009470 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/da77835b-2181-45cd-837e-b633fd15a3c5-hosts-file\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " 
pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009519 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-multus-certs\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009385 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009629 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-cni-multus\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009682 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-run-netns\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009849 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d959741-37e1-43e7-9ef6-5f33433f9447-cni-binary-copy\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009927 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-rootfs\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009954 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-hostroot\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.009960 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010042 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-os-release\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010073 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-host-var-lib-kubelet\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010098 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-system-cni-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010111 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d959741-37e1-43e7-9ef6-5f33433f9447-multus-socket-dir-parent\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010335 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31eb743e-decb-4243-ae21-91cc7b399ce1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.010478 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31eb743e-decb-4243-ae21-91cc7b399ce1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.014557 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-proxy-tls\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.022510 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.026040 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7t7p\" (UniqueName: \"kubernetes.io/projected/874ffcf5-fe2e-4225-a2a1-38f900cbffaf-kube-api-access-d7t7p\") pod \"machine-config-daemon-s2frk\" (UID: \"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\") " pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.031836 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jv5l\" (UniqueName: \"kubernetes.io/projected/da77835b-2181-45cd-837e-b633fd15a3c5-kube-api-access-7jv5l\") pod \"node-resolver-xpm98\" (UID: \"da77835b-2181-45cd-837e-b633fd15a3c5\") " pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.032440 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzfm\" (UniqueName: 
\"kubernetes.io/projected/4d959741-37e1-43e7-9ef6-5f33433f9447-kube-api-access-klzfm\") pod \"multus-cmdjj\" (UID: \"4d959741-37e1-43e7-9ef6-5f33433f9447\") " pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.033129 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krjrr\" (UniqueName: \"kubernetes.io/projected/31eb743e-decb-4243-ae21-91cc7b399ce1-kube-api-access-krjrr\") pod \"multus-additional-cni-plugins-6d7gz\" (UID: \"31eb743e-decb-4243-ae21-91cc7b399ce1\") " pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.037084 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.051538 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.059657 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.083736 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01
T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.102200 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.111086 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xpm98" Feb 01 06:47:51 crc kubenswrapper[5127]: W0201 06:47:51.113040 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874ffcf5_fe2e_4225_a2a1_38f900cbffaf.slice/crio-cca8db996b6a96e3e099e9849ec983b58ad9437c07c16b8b8afa2e9407c717b8 WatchSource:0}: Error finding container cca8db996b6a96e3e099e9849ec983b58ad9437c07c16b8b8afa2e9407c717b8: Status 404 returned error can't find the container with id cca8db996b6a96e3e099e9849ec983b58ad9437c07c16b8b8afa2e9407c717b8 Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.117372 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cmdjj" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.123559 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.127745 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzf
m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: W0201 06:47:51.140199 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d959741_37e1_43e7_9ef6_5f33433f9447.slice/crio-f01c89b6429462aebb99d8edc412c575fd3d929ce5872835de97a5451818cfab WatchSource:0}: Error finding container f01c89b6429462aebb99d8edc412c575fd3d929ce5872835de97a5451818cfab: Status 404 returned error can't find the container with id f01c89b6429462aebb99d8edc412c575fd3d929ce5872835de97a5451818cfab Feb 01 06:47:51 crc kubenswrapper[5127]: W0201 06:47:51.142070 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31eb743e_decb_4243_ae21_91cc7b399ce1.slice/crio-107d0fd903552a7e67ba3a6d37f0a49ddfdecd8dcf0c839ef0116ee576d71bbe WatchSource:0}: Error finding container 107d0fd903552a7e67ba3a6d37f0a49ddfdecd8dcf0c839ef0116ee576d71bbe: Status 404 returned error can't find the container with id 107d0fd903552a7e67ba3a6d37f0a49ddfdecd8dcf0c839ef0116ee576d71bbe Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.142931 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.152628 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.167169 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njlcv"] Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.167861 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.175888 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.179318 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 20:55:23.668910486 +0000 UTC Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.179499 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.179858 5127 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.179964 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.180056 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.180164 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.180338 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.180433 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.188655 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc 
kubenswrapper[5127]: I0201 06:47:51.199942 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.209156 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210547 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210574 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210611 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptwjj\" (UniqueName: \"kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210629 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210653 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210666 5127 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210680 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210694 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210716 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210737 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210750 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210772 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210787 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210807 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: 
\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210820 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210836 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210851 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210880 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.210895 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.221534 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 
01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.233683 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.255318 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.264693 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.278316 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311334 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311372 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311390 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311406 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311431 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311433 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311448 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311445 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311496 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311527 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311544 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311559 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311574 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311604 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptwjj\" (UniqueName: \"kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311595 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311619 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311640 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311703 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311721 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311738 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311783 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311810 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311836 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311857 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.311909 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312087 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312156 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312159 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312213 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312247 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312276 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312286 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312312 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312317 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312461 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312465 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.312486 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.315005 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.318834 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.345369 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptwjj\" (UniqueName: \"kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj\") pod \"ovnkube-node-njlcv\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.372954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerStarted","Data":"107d0fd903552a7e67ba3a6d37f0a49ddfdecd8dcf0c839ef0116ee576d71bbe"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.374734 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerStarted","Data":"0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.374770 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerStarted","Data":"f01c89b6429462aebb99d8edc412c575fd3d929ce5872835de97a5451818cfab"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.376691 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpm98" event={"ID":"da77835b-2181-45cd-837e-b633fd15a3c5","Type":"ContainerStarted","Data":"5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.376727 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpm98" event={"ID":"da77835b-2181-45cd-837e-b633fd15a3c5","Type":"ContainerStarted","Data":"81e9766bab316d7f7f9d955936ab9e23addbf34f063795fdec6522512b42ef66"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.378519 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.378546 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"cca8db996b6a96e3e099e9849ec983b58ad9437c07c16b8b8afa2e9407c717b8"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.380286 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.380327 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.380337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9ea7001ae43f43ed0f78b7d079f5af8b9876d2d6be055741a0facabb9677574b"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.382248 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2c42e5d3ba4ad2d4d426b1979f005bceabcb1c0b20af241e7f2cee32750d7d0a"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.384010 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.384783 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.384824 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9227a67ceb68c83553b8a776537f64742ec672d4894010d186c2c936d0424a70"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.387075 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.388359 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3"} Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.388918 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:51 crc 
kubenswrapper[5127]: I0201 06:47:51.392246 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.420536 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.463393 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.500728 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.575317 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.585059 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:51 crc kubenswrapper[5127]: W0201 06:47:51.598526 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5034ec6a_7968_4592_a09b_a57a56ebdbc5.slice/crio-0e594eee15da8e810adc438a0682ab8121b6fd3fc0e2b99d2df197f8c78f2d9b WatchSource:0}: Error finding container 0e594eee15da8e810adc438a0682ab8121b6fd3fc0e2b99d2df197f8c78f2d9b: Status 404 returned error can't find the container with id 0e594eee15da8e810adc438a0682ab8121b6fd3fc0e2b99d2df197f8c78f2d9b Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.607017 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.622249 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.659162 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.697831 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.743784 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.783414 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.817273 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.817385 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:51 crc kubenswrapper[5127]: 
I0201 06:47:51.817414 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.817437 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817475 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:47:53.817443381 +0000 UTC m=+24.303345754 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817520 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.817531 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817570 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:53.817557994 +0000 UTC m=+24.303460347 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817657 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817663 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817668 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817683 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817702 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817740 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817755 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817709 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:53.817697268 +0000 UTC m=+24.303599641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817833 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:53.817814181 +0000 UTC m=+24.303716614 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:51 crc kubenswrapper[5127]: E0201 06:47:51.817860 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:53.817851952 +0000 UTC m=+24.303754425 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.819939 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.860228 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.906928 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.940023 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:51 crc kubenswrapper[5127]: I0201 06:47:51.983217 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name
\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-02-01T06:47:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.180502 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 23:06:00.178767068 +0000 UTC Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.235143 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.235232 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:52 crc kubenswrapper[5127]: E0201 06:47:52.235297 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.235356 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:52 crc kubenswrapper[5127]: E0201 06:47:52.235425 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:47:52 crc kubenswrapper[5127]: E0201 06:47:52.235486 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.392940 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02" exitCode=0 Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.393002 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02"} Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.395023 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028"} Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.396466 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" exitCode=0 Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.396646 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.396689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"0e594eee15da8e810adc438a0682ab8121b6fd3fc0e2b99d2df197f8c78f2d9b"} Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.411618 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.428713 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.443967 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.459955 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.491916 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc 
kubenswrapper[5127]: I0201 06:47:52.504439 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.559392 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.570957 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.591626 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.603205 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.612802 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.627515 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.638344 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.649104 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc 
kubenswrapper[5127]: I0201 06:47:52.661119 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.674515 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.675966 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kqhgg"] Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.676472 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.678257 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.678325 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.691259 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.711103 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.726427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/184f0be4-bae6-4988-8d01-862fa5745a14-host\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.726487 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dpx\" (UniqueName: \"kubernetes.io/projected/184f0be4-bae6-4988-8d01-862fa5745a14-kube-api-access-t6dpx\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.726516 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/184f0be4-bae6-4988-8d01-862fa5745a14-serviceca\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.736631 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.780937 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1
ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.820347 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.827313 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/184f0be4-bae6-4988-8d01-862fa5745a14-host\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.827362 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dpx\" (UniqueName: \"kubernetes.io/projected/184f0be4-bae6-4988-8d01-862fa5745a14-kube-api-access-t6dpx\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.827379 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/184f0be4-bae6-4988-8d01-862fa5745a14-serviceca\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.827468 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/184f0be4-bae6-4988-8d01-862fa5745a14-host\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.828278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/184f0be4-bae6-4988-8d01-862fa5745a14-serviceca\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.865974 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dpx\" (UniqueName: \"kubernetes.io/projected/184f0be4-bae6-4988-8d01-862fa5745a14-kube-api-access-t6dpx\") pod \"node-ca-kqhgg\" (UID: \"184f0be4-bae6-4988-8d01-862fa5745a14\") " pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.885361 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.921877 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc kubenswrapper[5127]: I0201 06:47:52.966247 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:52 crc 
kubenswrapper[5127]: I0201 06:47:52.996917 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.003087 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kqhgg" Feb 01 06:47:53 crc kubenswrapper[5127]: W0201 06:47:53.012702 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184f0be4_bae6_4988_8d01_862fa5745a14.slice/crio-b5269c551bbd8e612eed03daf435fd035093ff2346f06eb8d2c604e0774f39b9 WatchSource:0}: Error finding container b5269c551bbd8e612eed03daf435fd035093ff2346f06eb8d2c604e0774f39b9: Status 404 returned error can't find the container with id b5269c551bbd8e612eed03daf435fd035093ff2346f06eb8d2c604e0774f39b9 Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.050082 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z 
is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.080800 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.122089 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.157803 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.181116 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 19:58:01.721055751 +0000 UTC Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.196956 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.239481 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.280814 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc 
kubenswrapper[5127]: I0201 06:47:53.318147 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.362880 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.398353 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.400924 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.402560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kqhgg" event={"ID":"184f0be4-bae6-4988-8d01-862fa5745a14","Type":"ContainerStarted","Data":"0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.402605 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kqhgg" event={"ID":"184f0be4-bae6-4988-8d01-862fa5745a14","Type":"ContainerStarted","Data":"b5269c551bbd8e612eed03daf435fd035093ff2346f06eb8d2c604e0774f39b9"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.406157 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.406189 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.406204 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.406215 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.406226 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 
06:47:53.406236 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.408093 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768" exitCode=0 Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.408159 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768"} Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.440561 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.482508 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.522310 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.564323 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.601811 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.639492 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.677765 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.720949 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.761182 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.803494 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-
01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.837373 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.837475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837533 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:47:57.837500508 +0000 UTC m=+28.323402901 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.837617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.837690 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837624 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.837744 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837791 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837796 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837830 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837840 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837847 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837865 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837874 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837900 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:57.837879648 +0000 UTC m=+28.323782031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837937 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:57.837914679 +0000 UTC m=+28.323817052 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837965 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:57.83795512 +0000 UTC m=+28.323857493 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:53 crc kubenswrapper[5127]: E0201 06:47:53.837992 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:47:57.837983081 +0000 UTC m=+28.323885464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.839998 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.884413 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.925655 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:53 crc kubenswrapper[5127]: I0201 06:47:53.964654 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.003989 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.038523 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.090402 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.181609 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 06:38:17.0982907 +0000 UTC Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.234682 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.234695 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:54 crc kubenswrapper[5127]: E0201 06:47:54.234883 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:47:54 crc kubenswrapper[5127]: E0201 06:47:54.234941 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.234709 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:54 crc kubenswrapper[5127]: E0201 06:47:54.235143 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.414455 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4" exitCode=0 Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.414528 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4"} Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.432675 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.463955 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.480801 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\
\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.499266 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.516423 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.532297 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.551455 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.569800 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.583816 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
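The trailing error on each rejected patch is Go's crypto/x509 validity-window failure. That check is a plain comparison of the verification time against the certificate's NotBefore/NotAfter bounds, which is why a forward clock jump alone is enough to break every handshake. A sketch of that comparison using the exact timestamps from these records (this is not the verifier's actual code path, and NotBefore is not shown in the log, so it is assumed here):

```go
// window: the validity comparison behind "x509: certificate has expired
// or is not yet valid", using the timestamps reported in the log.
package main

import (
	"fmt"
	"time"
)

func checkWindow(now, notBefore, notAfter time.Time) error {
	if now.Before(notBefore) || now.After(notAfter) {
		return fmt.Errorf("certificate has expired or is not yet valid: current time %s is after %s",
			now.Format(time.RFC3339), notAfter.Format(time.RFC3339))
	}
	return nil
}

func main() {
	now, _ := time.Parse(time.RFC3339, "2026-02-01T06:47:54Z")      // node clock
	notAfter, _ := time.Parse(time.RFC3339, "2025-08-24T17:21:41Z") // webhook cert expiry
	notBefore := notAfter.AddDate(-1, 0, 0)                         // assumed; not in the log
	fmt.Println(checkWindow(now, notBefore, notAfter))
}
```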
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.595886 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.610698 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.627342 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:54 crc kubenswrapper[5127]: I0201 06:47:54.646684 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:54Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.182497 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 17:29:46.831868907 +0000 UTC Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.420785 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea" exitCode=0 Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.420855 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea"} Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.443050 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.465827 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.480083 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.496681 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.514779 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.533346 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.545553 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\"
 for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.576043 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z 
is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.601755 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.619050 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.634561 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.647753 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:55 crc kubenswrapper[5127]: I0201 06:47:55.667050 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:55Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.174499 5127 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.177110 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.177141 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.177149 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.177284 5127 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.183132 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:57:16.79203307 +0000 UTC Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.186963 5127 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.187305 5127 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.188788 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.188821 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.188831 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.188845 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.188855 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.202356 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.206739 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.206923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.207056 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.207204 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.207337 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.221551 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.225402 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.225507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
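The "Error updating node status, will retry" entries in this burst all fail identically: the node-status PATCH is rejected by the node.network-node-identity.openshift.io admission webhook because its serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-02-01. The x509 verdict the kubelet keeps printing can be confirmed straight off the webhook endpoint. A minimal Go sketch, assuming it runs on the node itself and that 127.0.0.1:9743 is the webhook server named in the log:

```go
// Diagnostic sketch: fetch the webhook's serving certificate and print
// its validity window. InsecureSkipVerify is deliberate -- we want the
// handshake to succeed so we can inspect the (expired) certificate.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspection only; do not trust this session
	})
	if err != nil {
		log.Fatalf("handshake failed: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate expired -- matches the kubelet's x509 failure above")
	}
}
```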
event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.225527 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.225551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.225690 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.235373 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.235441 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.235448 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.235886 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.235983 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.236048 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.239529 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.239529 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the 06:47:56.202356 attempt above; omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z"
Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.243108 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.243147 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
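Each "Error updating node status, will retry" line is one pass of the kubelet's bounded retry loop for the status sync; when every attempt fails, the sync is abandoned until the next period, which is why the burst repeats with a byte-identical payload -- nothing on the node changes between attempts. A rough sketch of that loop shape (the kubelet caps attempts at nodeStatusUpdateRetry, historically 5; treat the exact value and the stand-in patch function as assumptions):

```go
// Sketch of the retry shape behind kubelet_node_status.go:585.
// patchNodeStatus is a hypothetical stand-in for the real PATCH to the
// API server, failing here the same way every attempt in this log does.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumed cap, matching the kubelet sources

func patchNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": x509: certificate has expired or is not yet valid`)
}

func updateNodeStatus() error {
	for i := 1; i <= nodeStatusUpdateRetry; i++ {
		err := patchNodeStatus()
		if err == nil {
			return nil
		}
		fmt.Printf("Error updating node status, will retry (attempt %d/%d): %v\n", i, nodeStatusUpdateRetry, err)
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println(err) // sync gives up until the next period
	}
}
```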
event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.243162 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.243186 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.243202 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.256615 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.261035 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.261101 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.261118 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.261138 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.261158 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.275826 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: E0201 06:47:56.276028 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.278355 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.278397 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.278411 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.278431 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.278444 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.380772 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.380862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.380880 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.380902 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.380918 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.428291 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0" exitCode=0 Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.428430 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.435775 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.442753 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.457688 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.473303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.484446 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.484487 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.484600 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.484615 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.484631 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.486435 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.496227 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.508857 5127 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.524197 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.536735 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.549572 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.566539 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f
6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.582874 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.587315 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.587340 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.587349 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.587363 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.587373 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.603676 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae360339566
7f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.614303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.679400 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.685261 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.687770 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.689967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.689992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.690001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.690014 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.690023 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.693239 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.707030 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.725567 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.736853 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.746350 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.757146 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.764937 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.780538 5127 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.787650 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.792512 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.792633 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.792661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.792689 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.792709 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.806112 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.820719 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.837532 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.849303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.866519 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.882303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.893230 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.894599 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.894635 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.894644 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.894666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.894675 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.906881 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.916411 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.929723 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.942286 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.957572 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.972927 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.991227 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:56Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.998106 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.998176 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.998201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.998236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:56 crc kubenswrapper[5127]: I0201 06:47:56.998261 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:56Z","lastTransitionTime":"2026-02-01T06:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.008200 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.024278 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.037893 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.047463 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.057766 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.101404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.101458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.101475 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.101499 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.101517 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.183748 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 08:15:21.257402732 +0000 UTC Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.204440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.204498 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.204514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.204539 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.204556 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.308118 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.308174 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.308193 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.308215 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.308234 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.411192 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.411240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.411251 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.411268 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.411280 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.450828 5127 generic.go:334] "Generic (PLEG): container finished" podID="31eb743e-decb-4243-ae21-91cc7b399ce1" containerID="cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874" exitCode=0 Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.450972 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerDied","Data":"cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.467681 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.482055 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.507133 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.513743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.513773 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.513784 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.513797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.513807 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.523565 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.543668 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.608234 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.620075 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.620134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.620150 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.620177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.620193 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.628951 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.641693 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.659557 5127 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.677476 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.694004 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.714953 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.723218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.723251 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.723259 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.723272 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.723280 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.731391 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.744922 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:57Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.825771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.825807 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.825815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.825829 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.825838 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.879415 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.879635 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.879730 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.879682629 +0000 UTC m=+36.365585032 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.879826 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.879854 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.879874 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.879956 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.879933835 +0000 UTC m=+36.365836228 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.880540 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.880694 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.880853 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.880833309 +0000 UTC m=+36.366735702 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.880762 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.880963 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.880997 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.881089 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.881068356 +0000 UTC m=+36.366970759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.881113 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.881143 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.881162 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:57 crc kubenswrapper[5127]: E0201 06:47:57.881220 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.881206139 +0000 UTC m=+36.367108542 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.928483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.928518 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.928527 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.928540 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:57 crc kubenswrapper[5127]: I0201 06:47:57.928549 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:57Z","lastTransitionTime":"2026-02-01T06:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.030868 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.030937 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.030955 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.030985 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.031004 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.133702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.133752 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.133762 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.133780 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.133794 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.183895 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 04:28:47.895044321 +0000 UTC Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.234749 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.234815 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:47:58 crc kubenswrapper[5127]: E0201 06:47:58.234914 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.234766 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:47:58 crc kubenswrapper[5127]: E0201 06:47:58.235045 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:47:58 crc kubenswrapper[5127]: E0201 06:47:58.235135 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.236284 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.236319 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.236329 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.236342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.236354 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.338621 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.338669 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.338681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.338701 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.338713 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.441540 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.441590 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.441600 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.441612 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.441620 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.458414 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.458730 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.458793 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.458819 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.467371 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" event={"ID":"31eb743e-decb-4243-ae21-91cc7b399ce1","Type":"ContainerStarted","Data":"1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.475863 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.483990 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.488016 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.497404 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.513467 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.529108 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.543430 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.543463 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.543470 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.543483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.543491 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.545313 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.565456 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.578993 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.589421 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.604940 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.621824 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.633700 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.644991 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.645292 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.645331 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.645343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.645357 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.645368 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.654262 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.672112 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.683734 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.693660 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.704184 5127 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.716756 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.733064 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.747417 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.747463 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.747476 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.747495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.747508 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.750689 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.766780 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.780005 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.797031 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.810058 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.844653 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.850777 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.850828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.850842 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.850868 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.850883 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.879351 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.890273 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.914272 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:47:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.953479 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.953514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.953521 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.953536 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:58 crc kubenswrapper[5127]: I0201 06:47:58.953546 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:58Z","lastTransitionTime":"2026-02-01T06:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.056092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.056134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.056142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.056155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.056164 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.158945 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.158989 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.159005 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.159025 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.159040 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.184335 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 20:15:04.600741455 +0000 UTC Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.261965 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.262028 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.262045 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.262092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.262111 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.364713 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.364787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.364810 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.364841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.364886 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.470389 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.470452 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.470514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.470543 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.470563 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.572962 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.573044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.573064 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.573090 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.573106 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.676152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.676197 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.676212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.676231 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.676246 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.779749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.779815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.779840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.779871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.779895 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.882773 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.882859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.882893 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.882924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.882965 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.986719 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.986775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.986785 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.986807 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:47:59 crc kubenswrapper[5127]: I0201 06:47:59.986820 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:47:59Z","lastTransitionTime":"2026-02-01T06:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.089263 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.089330 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.089343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.089367 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.089381 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.185381 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:11:54.788667206 +0000 UTC Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.192766 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.192831 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.192845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.192865 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.192883 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.234723 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.234820 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:00 crc kubenswrapper[5127]: E0201 06:48:00.234862 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.234994 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:00 crc kubenswrapper[5127]: E0201 06:48:00.235151 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:00 crc kubenswrapper[5127]: E0201 06:48:00.235310 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.248707 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mount
Path\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.263646 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.281641 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.295076 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.295142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.295161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.295186 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.295238 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.301603 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.323662 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.343927 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.344291 5127 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.364188 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c
1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-
release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.373753 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.391248 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath
\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.397518 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.397572 5127 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.397613 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.397638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.397666 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.410460 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\
"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.423712 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.438095 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.451117 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.466069 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.500951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.501042 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.501061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.501126 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.501147 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.603868 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.603930 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.603948 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.603970 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.603986 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.707092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.707145 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.707161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.707187 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.707204 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.809449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.809492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.809525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.809549 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.809563 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.911350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.911390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.911421 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.911438 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:00 crc kubenswrapper[5127]: I0201 06:48:00.911450 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:00Z","lastTransitionTime":"2026-02-01T06:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.013730 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.013797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.013814 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.013838 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.013855 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.116796 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.116835 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.116891 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.116908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.116919 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.186455 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 04:58:04.192352967 +0000 UTC
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.219553 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.219661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.219689 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.219718 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.219737 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.322855 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.322919 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.322943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.322971 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.322993 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.426177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.426239 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.426258 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.426284 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.426302 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.484657 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/0.log"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.488658 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7" exitCode=1
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.488689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7"}
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.491533 5127 scope.go:117] "RemoveContainer" containerID="1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.508814 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.529335 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.529665 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.529841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.530040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.530248 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.530778 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.555879 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.569173 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.586474 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.602742 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.634613 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.634667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.634682 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.634704 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.634719 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.638342 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.653731 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.679025 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:00Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0201 06:48:00.719981 6535 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720243 6535 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720446 6535 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720674 6535 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720854 6535 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720977 6535 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.721393 6535 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:00.721406 6535 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:00.721439 6535 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:00.721463 6535 factory.go:656] Stopping watch factory\\\\nI0201 06:48:00.721479 6535 ovnkube.go:599] Stopped ovnkube\\\\nI0201 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.699708 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.717528 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.734071 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.737098 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.737271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.737400 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.737554 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.737727 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.749611 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.775376 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:01Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.840343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.840378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.840390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.840407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.840417 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.943461 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.943541 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.943558 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.943643 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:01 crc kubenswrapper[5127]: I0201 06:48:01.943664 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:01Z","lastTransitionTime":"2026-02-01T06:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.046153 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.046207 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.046222 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.046241 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.046255 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.180300 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.180332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.180343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.180358 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.180366 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.186820 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 21:39:08.955882275 +0000 UTC Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.234947 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.234969 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:02 crc kubenswrapper[5127]: E0201 06:48:02.235093 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.235153 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:02 crc kubenswrapper[5127]: E0201 06:48:02.235291 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:02 crc kubenswrapper[5127]: E0201 06:48:02.235414 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.283550 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.283639 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.283657 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.283681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.283701 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.386449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.386503 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.386517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.386537 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.386552 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.489040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.489129 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.489142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.489159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.489199 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.493455 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/0.log" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.496753 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.497317 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.512168 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.534087 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.550420 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.571125 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:00Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0201 06:48:00.719981 6535 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720243 6535 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720446 6535 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720674 6535 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720854 6535 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720977 6535 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.721393 6535 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:00.721406 6535 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:00.721439 6535 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:00.721463 6535 factory.go:656] Stopping watch factory\\\\nI0201 06:48:00.721479 6535 ovnkube.go:599] Stopped ovnkube\\\\nI0201 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.583314 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.592201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.592247 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.592264 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.592285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.592303 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.603912 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.621943 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.633470 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.645464 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.659212 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.670887 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.683625 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.694964 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.694999 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.695010 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.695024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.695035 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.697950 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.711004 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:02Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.798226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.798285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.798302 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.798325 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.798344 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.901606 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.901658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.901668 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.901683 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:02 crc kubenswrapper[5127]: I0201 06:48:02.901693 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:02Z","lastTransitionTime":"2026-02-01T06:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.008342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.008404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.008426 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.008458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.008481 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.112180 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.112889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.112961 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.113002 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.113032 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.187870 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 05:18:14.234093995 +0000 UTC Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.216810 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.216907 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.216924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.216948 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.216965 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.308105 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr"] Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.308796 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.312418 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.312418 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.320845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.320916 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.320937 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.320964 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.320985 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.326752 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.351605 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:00Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0201 06:48:00.719981 6535 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720243 6535 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720446 6535 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720674 6535 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720854 6535 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720977 6535 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.721393 6535 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:00.721406 6535 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:00.721439 6535 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:00.721463 6535 factory.go:656] Stopping watch factory\\\\nI0201 06:48:00.721479 6535 ovnkube.go:599] Stopped ovnkube\\\\nI0201 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.369475 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.381521 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.396972 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.409512 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.422940 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.423278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.423304 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.423313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.423327 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.423337 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.434125 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.440109 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.440196 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a16bde8c-7758-4d94-a246-bbafcff4d733-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.440263 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j5pd\" (UniqueName: \"kubernetes.io/projected/a16bde8c-7758-4d94-a246-bbafcff4d733-kube-api-access-7j5pd\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.440314 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.447475 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.459946 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.472571 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.482200 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.494418 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.500416 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/1.log" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.501019 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/0.log" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.503918 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee" exitCode=1 Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.503966 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.504032 5127 scope.go:117] "RemoveContainer" containerID="1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.504677 5127 scope.go:117] "RemoveContainer" containerID="a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee" Feb 01 06:48:03 crc kubenswrapper[5127]: E0201 06:48:03.504893 5127 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.510794 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.525818 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.526297 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.526432 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.526625 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.526755 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.527983 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.541804 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.542025 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a16bde8c-7758-4d94-a246-bbafcff4d733-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.542156 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j5pd\" (UniqueName: \"kubernetes.io/projected/a16bde8c-7758-4d94-a246-bbafcff4d733-kube-api-access-7j5pd\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.542275 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.542570 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.542926 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a16bde8c-7758-4d94-a246-bbafcff4d733-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.543270 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.548755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a16bde8c-7758-4d94-a246-bbafcff4d733-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.558948 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.559873 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j5pd\" (UniqueName: \"kubernetes.io/projected/a16bde8c-7758-4d94-a246-bbafcff4d733-kube-api-access-7j5pd\") pod \"ovnkube-control-plane-749d76644c-zjfhr\" (UID: \"a16bde8c-7758-4d94-a246-bbafcff4d733\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 
06:48:03.575357 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.591048 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.603207 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.621468 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.627928 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.630237 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.630309 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.630333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.630362 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.630387 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: W0201 06:48:03.639741 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda16bde8c_7758_4d94_a246_bbafcff4d733.slice/crio-0e45f432af2b6b63fb54bb0aa4f801ebbdc0eca3d38b948b4a30854b25e81c29 WatchSource:0}: Error finding container 0e45f432af2b6b63fb54bb0aa4f801ebbdc0eca3d38b948b4a30854b25e81c29: Status 404 returned error can't find the container with id 0e45f432af2b6b63fb54bb0aa4f801ebbdc0eca3d38b948b4a30854b25e81c29 Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.641266 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.660743 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.673134 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.690748 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.708504 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.723033 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.732890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.733108 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc 
kubenswrapper[5127]: I0201 06:48:03.733301 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.733479 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.733710 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.738045 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.758210 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:00Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0201 06:48:00.719981 6535 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720243 6535 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720446 6535 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720674 6535 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720854 6535 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720977 6535 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.721393 6535 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:00.721406 6535 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:00.721439 6535 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:00.721463 6535 factory.go:656] Stopping watch factory\\\\nI0201 06:48:00.721479 6535 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: 
[]services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.771909 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:03Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.836240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.836280 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.836291 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.836306 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.836316 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.939810 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.939848 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.939862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.939879 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:03 crc kubenswrapper[5127]: I0201 06:48:03.939893 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:03Z","lastTransitionTime":"2026-02-01T06:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.042352 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.042387 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.042395 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.042412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.042420 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.144750 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.144788 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.144797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.144813 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.144821 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.188525 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 00:07:19.007097146 +0000 UTC
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.235089 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.235104 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.235158 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:48:04 crc kubenswrapper[5127]: E0201 06:48:04.235204 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:48:04 crc kubenswrapper[5127]: E0201 06:48:04.235302 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:48:04 crc kubenswrapper[5127]: E0201 06:48:04.235379 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.246509 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.246720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.246850 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.246977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.247119 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.350737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.350780 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.350793 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.350816 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.350829 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.454099 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.454157 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.454175 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.454209 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.454241 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.509958 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" event={"ID":"a16bde8c-7758-4d94-a246-bbafcff4d733","Type":"ContainerStarted","Data":"5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.510049 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" event={"ID":"a16bde8c-7758-4d94-a246-bbafcff4d733","Type":"ContainerStarted","Data":"24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.510075 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" event={"ID":"a16bde8c-7758-4d94-a246-bbafcff4d733","Type":"ContainerStarted","Data":"0e45f432af2b6b63fb54bb0aa4f801ebbdc0eca3d38b948b4a30854b25e81c29"}
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.511964 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/1.log"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.516315 5127 scope.go:117] "RemoveContainer" containerID="a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee"
Feb 01 06:48:04 crc kubenswrapper[5127]: E0201 06:48:04.516534 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5"
Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.525034 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.550787 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.556253 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.556303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.556314 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.556335 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.556347 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.569274 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.589847 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.603014 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.613095 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.624488 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.637973 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.650383 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.658657 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.658693 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.658706 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.658720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.658729 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.662290 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.674143 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.684677 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.699118 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.711172 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.733733 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1719f5b5a503ff78dbdfbee8aa195fe3569f8c35cba5a8aaab3654dceb8346d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:00Z\\\",\\\"message\\\":\\\"versions/factory.go:140\\\\nI0201 06:48:00.719981 6535 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720243 6535 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0201 06:48:00.720446 6535 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720674 6535 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720854 6535 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.720977 6535 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0201 06:48:00.721393 6535 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:00.721406 6535 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:00.721439 6535 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:00.721463 6535 factory.go:656] Stopping watch factory\\\\nI0201 06:48:00.721479 6535 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\
\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.746040 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.756949 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.760827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.760922 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.760980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.761100 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.761170 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.772234 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.785681 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.806454 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.827645 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.831100 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ls5xc"] Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.831618 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:04 crc kubenswrapper[5127]: E0201 06:48:04.831735 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.840906 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.852775 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863417 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863503 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863577 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863667 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.863868 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.876603 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.885561 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.896163 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.912255 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.928688 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.943605 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.958754 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.958930 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8tm\" (UniqueName: \"kubernetes.io/projected/bafc814f-6c41-40cf-b3f4-8babc6ec840a-kube-api-access-vd8tm\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.959072 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.965758 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.965813 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.965826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.965846 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.965899 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:04Z","lastTransitionTime":"2026-02-01T06:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
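Note: the patch bodies in these entries look triple-escaped because each one is a JSON document quoted into the err="..." field of a structured log line. Unquoting twice recovers plain JSON. A Go sketch using a radically shortened, hypothetical stand-in payload (the uid below is invented; only the escaping shape is taken from the log):

    package main

    import (
        "encoding/json"
        "fmt"
        "log"
        "strconv"
    )

    func main() {
        // Shaped like the err="..." values above, shortened for illustration.
        quoted := `"failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"abc\\\"}}\""`

        msg, err := strconv.Unquote(quoted) // undo the structured-log quoting
        if err != nil {
            log.Fatal(err)
        }
        patch, err := strconv.Unquote(msg[len("failed to patch status "):]) // undo the inner quoting
        if err != nil {
            log.Fatal(err)
        }
        var v map[string]any
        if err := json.Unmarshal([]byte(patch), &v); err != nil {
            log.Fatal(err)
        }
        fmt.Println(v["metadata"]) // map[uid:abc]
    }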
Has your network provider started?"} Feb 01 06:48:04 crc kubenswrapper[5127]: I0201 06:48:04.977490 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.000357 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:04Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.022949 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.044972 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.059966 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.060174 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.060278 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:05.560250403 +0000 UTC m=+36.046152806 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.060427 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8tm\" (UniqueName: \"kubernetes.io/projected/bafc814f-6c41-40cf-b3f4-8babc6ec840a-kube-api-access-vd8tm\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.063116 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config
\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.069005 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.069080 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.069103 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.069132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.069153 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.083307 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.086494 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8tm\" (UniqueName: 
\"kubernetes.io/projected/bafc814f-6c41-40cf-b3f4-8babc6ec840a-kube-api-access-vd8tm\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.110259 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apise
rver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.132449 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.152082 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.170178 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.171311 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.171365 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.171381 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.171405 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.171422 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.189545 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 03:50:12.407861855 +0000 UTC Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.190643 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.211177 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.234781 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.252283 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.274162 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.274220 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.274236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.274259 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.274274 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.283405 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:05Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.377513 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.377616 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.377640 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.377671 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.377693 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.480165 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.480234 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.480249 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.480271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.480286 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.566199 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.566326 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.566398 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:06.56638148 +0000 UTC m=+37.052283843 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.583167 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.583210 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.583219 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.583232 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.583242 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.686440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.686489 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.686498 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.686514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.686527 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.789410 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.789473 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.789491 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.789515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.789532 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.893048 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.893117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.893137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.893162 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.893180 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.970614 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.970852 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.970907 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:48:21.970861131 +0000 UTC m=+52.456763554 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971035 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971063 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971085 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971166 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:21.971141419 +0000 UTC m=+52.457043832 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971191 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971216 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971235 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971290 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:21.971275542 +0000 UTC m=+52.457177935 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.971035 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.971362 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.971412 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971522 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971563 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:21.9715503 +0000 UTC m=+52.457452693 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971680 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: E0201 06:48:05.971798 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:21.971772156 +0000 UTC m=+52.457674559 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.995828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.995886 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.995902 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.995927 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:05 crc kubenswrapper[5127]: I0201 06:48:05.995946 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:05Z","lastTransitionTime":"2026-02-01T06:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.100139 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.100195 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.100215 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.100239 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.100259 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.190482 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:33:43.350483412 +0000 UTC Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.203893 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.203959 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.203978 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.204007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.204025 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.234840 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.234951 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.235060 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.235163 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.235243 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.235383 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.235817 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.236190 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.307004 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.307457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.307740 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.307931 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.308112 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.346998 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.347035 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.347044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.347059 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.347067 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.367434 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:06Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.373852 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.374067 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.374283 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.374529 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.374770 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.393846 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:06Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.399297 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.399340 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.399351 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.399366 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.399377 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.421451 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:06Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.428781 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.428854 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.428876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.428911 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.428935 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.448914 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:06Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.454632 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.454712 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.454732 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.454755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.454775 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.474522 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:06Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.474813 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.477106 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.477167 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.477188 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.477216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.477238 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.578669 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.579517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.579551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.578849 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.579565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.579601 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: E0201 06:48:06.579625 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:08.579607429 +0000 UTC m=+39.065509792 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.579616 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.682471 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.682511 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.682522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.682538 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.682550 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.785722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.785786 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.785805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.785830 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.785849 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.888082 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.888117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.888137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.888152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.888161 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.990677 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.990717 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.990726 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.990741 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:06 crc kubenswrapper[5127]: I0201 06:48:06.990751 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:06Z","lastTransitionTime":"2026-02-01T06:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.093124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.093607 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.093722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.093818 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.093903 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.191649 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 19:18:39.176940867 +0000 UTC Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.196648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.196704 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.196718 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.196737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.196751 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.299416 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.299461 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.299475 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.299494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.299505 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.402237 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.402279 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.402305 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.402332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.402347 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.505654 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.505984 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.506116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.506243 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.506421 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.608724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.608749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.608761 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.608783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.608794 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.711641 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.711973 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.712113 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.712292 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.712498 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.815695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.815739 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.815755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.815778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.815798 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.919054 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.919100 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.919116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.919137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:07 crc kubenswrapper[5127]: I0201 06:48:07.919153 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:07Z","lastTransitionTime":"2026-02-01T06:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.021640 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.021676 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.021685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.021698 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.021706 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.124914 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.124975 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.124997 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.125021 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.125040 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.192238 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 11:32:07.652913764 +0000 UTC Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.227401 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.227454 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.227465 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.227482 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.227494 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.234917 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.235039 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.235425 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.235438 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.235508 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.236041 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.236162 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.236405 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.330172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.330516 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.330718 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.330867 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.330996 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.434389 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.434762 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.434908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.435028 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.435139 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.538220 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.538295 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.538325 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.538358 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.538383 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.602409 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.602720 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:08 crc kubenswrapper[5127]: E0201 06:48:08.602849 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:12.60282072 +0000 UTC m=+43.088723113 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.640980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.641048 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.641080 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.641116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.641137 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.744053 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.744142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.744169 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.744202 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.744233 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.847474 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.847521 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.847536 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.847561 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.847614 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.950212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.950247 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.950258 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.950272 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:08 crc kubenswrapper[5127]: I0201 06:48:08.950283 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:08Z","lastTransitionTime":"2026-02-01T06:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.053515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.053559 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.053571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.053607 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.053619 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.156599 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.156669 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.156688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.156711 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.156728 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.193142 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:19:32.819731362 +0000 UTC Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.259738 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.259805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.259826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.259853 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.259871 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.362463 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.362538 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.362556 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.362625 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.362649 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.465330 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.465385 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.465398 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.465418 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.465431 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.568316 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.568369 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.568383 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.568401 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.568414 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.649559 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.663165 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.671517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.671552 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.671564 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.671602 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.671615 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.678325 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.687940 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.698735 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.708289 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.720357 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.735036 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.745756 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.757674 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.768110 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.774098 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.774135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.774144 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.774158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.774168 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.786883 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b17
49668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.798440 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.809697 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 
2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.822930 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.838994 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.850766 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:09Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.876810 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.876861 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.876870 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.876882 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.876893 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.980564 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.980874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.980973 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.981075 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:09 crc kubenswrapper[5127]: I0201 06:48:09.981164 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:09Z","lastTransitionTime":"2026-02-01T06:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.083293 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.083507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.083570 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.083658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.083743 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.186879 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.187143 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.187225 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.187318 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.187406 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.194237 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 06:05:01.739753173 +0000 UTC Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.235328 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:10 crc kubenswrapper[5127]: E0201 06:48:10.235487 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.235570 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:10 crc kubenswrapper[5127]: E0201 06:48:10.235697 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.235970 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:10 crc kubenswrapper[5127]: E0201 06:48:10.236174 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.236196 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:10 crc kubenswrapper[5127]: E0201 06:48:10.236365 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.250819 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.263445 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.282137 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.289279 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.289339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.289352 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.289368 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.289379 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.296475 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.312197 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.327741 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.341209 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.353107 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.374186 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.387272 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 
01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.392462 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.392572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.392664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.392727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.392780 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.420314 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b17
49668decf69595d7a3f928ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.439148 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.460127 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.480692 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.495692 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.495745 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.495764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.495787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.495804 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.497867 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"nam
e\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.517371 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.599201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.599271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.599297 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.599327 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.599347 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.702690 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.702743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.702759 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.702785 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.702807 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.805100 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.805144 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.805155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.805172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.805184 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.907762 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.907840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.907852 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.907870 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:10 crc kubenswrapper[5127]: I0201 06:48:10.907881 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:10Z","lastTransitionTime":"2026-02-01T06:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.010243 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.010283 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.010292 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.010305 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.010316 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.112726 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.113147 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.113334 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.113510 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.113694 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.194382 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 11:59:41.104412765 +0000 UTC Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.216897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.216951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.216964 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.216982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.217347 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.319650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.319685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.319696 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.319712 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.319723 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.422129 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.422199 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.422217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.422241 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.422258 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.525013 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.525082 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.525107 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.525141 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.525161 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.628565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.628647 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.628664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.628685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.628701 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.731127 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.731190 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.731209 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.731234 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.731252 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.834303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.834409 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.834431 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.834457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.834474 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.938356 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.938428 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.938449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.938479 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:11 crc kubenswrapper[5127]: I0201 06:48:11.938503 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:11Z","lastTransitionTime":"2026-02-01T06:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.040715 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.040788 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.040812 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.040841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.040862 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.143538 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.143619 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.143638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.143661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.143677 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.195365 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:11:39.036344462 +0000 UTC Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.234790 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.234880 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.234900 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.234907 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.235054 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.235166 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.235291 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.235480 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.246930 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.247001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.247026 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.247058 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.247081 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.350908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.350974 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.350994 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.351019 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.351039 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.453327 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.453364 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.453374 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.453392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.453405 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.555687 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.555757 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.555776 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.555808 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.555833 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.646190 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.646432 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 01 06:48:12 crc kubenswrapper[5127]: E0201 06:48:12.646531 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:20.646510141 +0000 UTC m=+51.132412514 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered
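The nestedpendingoperations.go:348 record above shows the volume manager backing off after repeated MountVolume.SetUp failures: "object \"openshift-multus\"/\"metrics-daemon-secret\" not registered" means the kubelet's secret manager has no registration yet for that pod's secret (typical while the node is still resyncing after a restart), so the mount is retried with an exponentially growing delay. A rough Go sketch of that doubling schedule; the 500 ms initial delay and ~2 m cap are assumptions inferred from the observed (durationBeforeRetry 8s), not values read from this log:

package main

import (
	"fmt"
	"time"
)

// durationBeforeRetry sketches the doubling retry delay kubelet applies to
// failed volume operations: assumed ~500ms initial, x2 per failure, capped.
func durationBeforeRetry(failures int) time.Duration {
	const cap = 2*time.Minute + 2*time.Second // assumed upper bound
	d := 500 * time.Millisecond
	for i := 1; i < failures; i++ {
		d *= 2
		if d > cap {
			return cap
		}
	}
	return d
}

func main() {
	for f := 1; f <= 6; f++ {
		fmt.Printf("failure %d -> retry in %s\n", f, durationBeforeRetry(f))
	}
}

Under these assumptions, the fifth consecutive failure yields exactly the 8 s delay logged above, which is why the next attempt is scheduled for 06:48:20.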
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.659067 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.659209 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.659237 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.659312 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.659340 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.763224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.763297 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.763322 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.763355 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.763378 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.866372 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.866440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.866458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.866483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.866500 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.969685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.969752 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.969770 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.969793 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:12 crc kubenswrapper[5127]: I0201 06:48:12.969811 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:12Z","lastTransitionTime":"2026-02-01T06:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.073335 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.073391 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.073415 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.073445 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.073469 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.177130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.177217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.177276 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.177301 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.177353 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.196295 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:41:43.157548951 +0000 UTC Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.281140 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.281206 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.281224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.281246 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.281297 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.384718 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.384771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.384787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.384853 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.384880 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.488128 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.488197 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.488215 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.488241 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.488259 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.591687 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.591764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.591785 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.591811 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.591832 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.694716 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.694755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.694764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.694784 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.694794 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.797394 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.797420 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.797430 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.797442 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.797451 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.900977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.901043 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.901058 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.901086 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:13 crc kubenswrapper[5127]: I0201 06:48:13.901104 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:13Z","lastTransitionTime":"2026-02-01T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.004352 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.004414 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.004433 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.004456 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.004473 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.107744 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.107806 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.107823 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.107847 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.107863 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.196556 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 10:13:11.09616001 +0000 UTC Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.211229 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.211273 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.211291 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.211311 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.211326 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.235776 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.235888 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:14 crc kubenswrapper[5127]: E0201 06:48:14.235977 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:14 crc kubenswrapper[5127]: E0201 06:48:14.236095 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.236189 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:14 crc kubenswrapper[5127]: E0201 06:48:14.236306 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.236500 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:14 crc kubenswrapper[5127]: E0201 06:48:14.236650 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.315446 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.315515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.315540 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.315566 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.315615 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.418283 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.418332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.418351 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.418374 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.418392 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.521871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.521924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.521940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.521962 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.521980 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.625257 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.625318 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.625336 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.625361 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.625380 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.728183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.728244 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.728261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.728285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.728302 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.831859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.831930 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.831953 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.831984 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.832006 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.934469 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.934535 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.934559 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.934621 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:14 crc kubenswrapper[5127]: I0201 06:48:14.934646 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:14Z","lastTransitionTime":"2026-02-01T06:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.037479 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.037539 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.037556 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.037612 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.037642 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.140497 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.140565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.140627 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.140655 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.140673 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.196956 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 04:01:08.608909065 +0000 UTC Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.243826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.243879 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.243923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.243944 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.243960 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
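The certificate_manager.go:356 entries report the same 2026-02-24 05:53:03 expiration but a different rotation deadline on each attempt. Upstream client-go picks the deadline at a jittered fraction of the certificate's lifetime; the sketch below assumes a 70-90% window and a hypothetical one-year issuance time (neither is stated in this log) to show why consecutive entries can log different deadlines for one certificate:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline returns a uniformly random point in the 70-90% span of
// the certificate's validity window. The exact window is an assumption;
// the observable effect matches the log, where each attempt reports a new
// deadline (2025-12-12, 2026-01-05, ...) before the fixed expiration.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	lifetime := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(lifetime) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Hypothetical issuance time; only the expiration appears in the log.
	notBefore := time.Date(2025, time.February, 24, 5, 53, 3, 0, time.UTC)
	notAfter := time.Date(2026, time.February, 24, 5, 53, 3, 0, time.UTC)
	for i := 0; i < 3; i++ {
		fmt.Println(rotationDeadline(notBefore, notAfter))
	}
}

Each call yields a deadline strictly inside the validity window, consistent with the spread of deadlines logged above.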
Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.347333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.347407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.347434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.347467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.347489 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.450327 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.450394 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.450411 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.450434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.450450 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.553510 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.553557 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.553574 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.553633 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.553656 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.656666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.656720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.656737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.656763 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.656782 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.760154 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.760205 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.760222 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.760248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.760266 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.862462 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.862514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.862529 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.862551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.862565 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.966776 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.966858 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.966878 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.966904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:15 crc kubenswrapper[5127]: I0201 06:48:15.966925 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:15Z","lastTransitionTime":"2026-02-01T06:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.070511 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.070573 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.070648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.070679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.070703 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.173436 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.173760 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.173792 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.173816 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.173832 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.197895 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 22:15:18.579486988 +0000 UTC Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.235574 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.235619 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.235687 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.236022 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.236178 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.236321 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.236493 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.236829 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.276961 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.277018 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.277036 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.277061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.277080 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.380570 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.380649 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.380668 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.380692 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.380707 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.482890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.483052 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.483070 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.483095 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.483112 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.585824 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.586133 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.586274 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.586459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.586647 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.693410 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.693492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.693517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.693548 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.693572 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.727307 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.727343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.727354 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.727369 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.727380 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.745246 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.750658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.750692 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
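The node-status patch above is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose NotAfter (2025-08-24T17:21:41Z) is behind the node's clock (2026-02-01T06:48:16Z), so the TLS handshake fails before the patch is ever evaluated. As a hedged illustration (the file path is a placeholder, not taken from this log), a Go sketch of the validity-window check behind the "x509: certificate has expired or is not yet valid" error:

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical path to the webhook's serving certificate.
	pemBytes, err := os.ReadFile("webhook-serving-cert.pem")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	// Reproduce the validity-window check the TLS handshake performs.
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Println("certificate is not yet valid")
	default:
		fmt.Println("certificate is within its validity window")
	}
}

Until that certificate is renewed or the node clock is corrected, the "Error updating node status, will retry" attempts recorded in the surrounding entries will keep failing the same way.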
event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.750704 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.750743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.750757 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.770999 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.775734 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.775823 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.775839 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.775861 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.775878 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.793849 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.798667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.798882 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.799069 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.799269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.799416 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.819452 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.824689 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.824929 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.825078 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.825213 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.825331 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.844454 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:16Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:16 crc kubenswrapper[5127]: E0201 06:48:16.844571 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.846178 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.846238 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.846256 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.846281 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.846298 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.948989 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.949024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.949035 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.949050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:16 crc kubenswrapper[5127]: I0201 06:48:16.949059 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:16Z","lastTransitionTime":"2026-02-01T06:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.051776 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.052011 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.052092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.052217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.052310 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.155616 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.155671 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.155687 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.155709 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.155726 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.199231 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 08:32:20.545537637 +0000 UTC Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.259196 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.259261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.259282 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.259311 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.259332 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.362933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.362992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.363010 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.363033 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.363050 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.466358 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.466417 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.466434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.466459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.466475 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.568771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.568838 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.568864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.568893 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.568914 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.671459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.671807 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.671933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.672055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.672185 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.776245 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.776766 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.776951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.777152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.777340 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.880290 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.880700 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.880845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.880970 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.881088 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.984208 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.984271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.984290 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.984316 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:17 crc kubenswrapper[5127]: I0201 06:48:17.984335 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:17Z","lastTransitionTime":"2026-02-01T06:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.087492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.087615 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.087634 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.087658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.087676 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.191163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.191216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.191229 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.191245 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.191258 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.199617 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:54:01.275921544 +0000 UTC Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.274838 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.274850 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.274977 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.278635 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:18 crc kubenswrapper[5127]: E0201 06:48:18.279265 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:18 crc kubenswrapper[5127]: E0201 06:48:18.279395 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:18 crc kubenswrapper[5127]: E0201 06:48:18.279550 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:18 crc kubenswrapper[5127]: E0201 06:48:18.279741 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.282710 5127 scope.go:117] "RemoveContainer" containerID="a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.294870 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.294952 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.294972 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.294992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.295007 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.398308 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.398360 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.398377 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.398400 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.398419 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.501666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.501728 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.501751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.501783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.501807 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.570372 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/1.log" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.573552 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.574100 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.588167 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.601621 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.603767 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.603831 5127 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.603855 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.603887 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.603911 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.616303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.626356 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.638687 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.648985 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.664000 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.676119 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.690744 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.701908 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 
01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.706351 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.706392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.706402 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.706418 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.706430 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.731648 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c5
77f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.750046 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.767411 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.783707 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.803101 5127 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.809751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.809787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: 
I0201 06:48:18.809795 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.809809 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.809818 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.820666 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:18Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.912356 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.912384 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.912392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.912404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:18 crc kubenswrapper[5127]: I0201 06:48:18.912412 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:18Z","lastTransitionTime":"2026-02-01T06:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.014666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.014720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.014735 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.014756 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.014770 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.120424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.120502 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.120513 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.120551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.120774 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.200638 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:00:43.988808674 +0000 UTC Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.224159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.224207 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.224218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.224237 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.224249 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.327526 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.327556 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.327593 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.327609 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.327621 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.430486 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.430554 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.430572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.430629 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.430647 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.533212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.533265 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.533283 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.533308 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.533326 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.581260 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/2.log" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.582504 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/1.log" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.589545 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" exitCode=1 Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.589665 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.589763 5127 scope.go:117] "RemoveContainer" containerID="a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.592001 5127 scope.go:117] "RemoveContainer" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" Feb 01 06:48:19 crc kubenswrapper[5127]: E0201 06:48:19.592415 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.620678 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.636812 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.636859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.636875 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.636902 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.636921 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.642752 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.665727 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.685258 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.705653 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.730026 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.740779 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.740846 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.740866 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.740894 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.740912 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.746002 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.765445 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.783779 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.806766 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.827114 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.843484 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.843553 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.843571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.843635 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.843663 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.847796 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.869190 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.893474 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.910236 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.941688 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mou
ntPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.946826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.946887 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.946904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.946931 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:19 crc kubenswrapper[5127]: I0201 06:48:19.946952 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:19Z","lastTransitionTime":"2026-02-01T06:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.050156 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.050207 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.050223 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.050274 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.050293 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.152791 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.152850 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.152866 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.152892 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.152910 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.201629 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 23:49:43.708228841 +0000 UTC Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.235326 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.235436 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.235431 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.235365 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.235654 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.235748 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.235851 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.235964 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.255253 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.255298 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.255313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.255332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.255346 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.256874 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.278732 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.298720 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.320769 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.347444 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.358457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.358525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.358545 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.358575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.358638 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.367068 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.383164 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.399537 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.416328 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.438903 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.458299 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.468622 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.468705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.468766 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.468802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.468826 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.477419 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.497647 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.523031 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.536703 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.560559 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a445e81ee671dca1f058f8e487c2b6b2e6218b1749668decf69595d7a3f928ee\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:02Z\\\",\\\"message\\\":\\\"=default: []services.LB{}\\\\nI0201 06:48:02.635199 6661 services_controller.go:453] Built service openshift-multus/multus-admission-controller template LB for network=default: []services.LB{}\\\\nI0201 06:48:02.635206 6661 services_controller.go:454] Service openshift-multus/multus-admission-controller for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0201 06:48:02.635124 6661 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0201 06:48:02.635221 6661 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\", UUID:\\\\\\\"d4efc4a8-c514-4a6b-901c-2953978b50d3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-multus/multus-admission-controller\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-multus/multus-admission-controller_TCP_cluster\\\\\\\"\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer 
because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mou
ntPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.572227 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.572278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.572296 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.572321 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.572338 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.596073 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/2.log" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.603422 5127 scope.go:117] "RemoveContainer" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.603883 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.634114 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c5
77f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.650548 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.671946 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.675381 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.675457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.675482 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.675513 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.675536 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.687546 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.703535 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.720001 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.735264 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.735515 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:20 crc kubenswrapper[5127]: E0201 06:48:20.735654 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:48:36.735630981 +0000 UTC m=+67.221533374 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.736304 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06
:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.754396 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6
d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.773291 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.778133 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.778379 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.778569 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.778838 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.779048 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.794374 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.808770 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.824266 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.837847 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.854846 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.876930 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.882061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.882124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc 
kubenswrapper[5127]: I0201 06:48:20.882141 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.882168 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.882188 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.896833 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:20Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.984702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.984752 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.984769 5127 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.984791 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:20 crc kubenswrapper[5127]: I0201 06:48:20.984810 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:20Z","lastTransitionTime":"2026-02-01T06:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.087512 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.087614 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.087633 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.087661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.087680 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.191241 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.191313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.191331 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.191358 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.191376 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.202096 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 20:43:26.103505643 +0000 UTC Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.295061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.295137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.295155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.295179 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.295198 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.401654 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.401786 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.403204 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.403257 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.403285 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.506162 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.506201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.506213 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.506230 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.506242 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.607698 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.607727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.607734 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.607746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.607755 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.711981 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.712084 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.712107 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.712133 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.712165 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.815034 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.815108 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.815128 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.815205 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.815224 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.918221 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.918289 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.918312 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.918341 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:21 crc kubenswrapper[5127]: I0201 06:48:21.918359 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:21Z","lastTransitionTime":"2026-02-01T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.021864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.021925 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.021943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.021965 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.021982 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.047924 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.048047 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048192 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:48:54.04804435 +0000 UTC m=+84.533946763 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048253 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048290 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048310 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.048310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048371 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:54.048353148 +0000 UTC m=+84.534255551 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.048419 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.048475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048489 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048519 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048542 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048552 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048631 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:54.048617655 +0000 UTC m=+84.534520038 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048665 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:54.048640426 +0000 UTC m=+84.534542839 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048747 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.048857 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:48:54.048838981 +0000 UTC m=+84.534741344 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.125524 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.125575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.125620 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.125644 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.125661 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.203319 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 08:33:50.505475712 +0000 UTC Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.228961 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.229030 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.229050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.229079 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.229099 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.235004 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.235024 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.235119 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.235180 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.235222 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.235317 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.235561 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:22 crc kubenswrapper[5127]: E0201 06:48:22.235668 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.331323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.331388 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.331406 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.331431 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.331451 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.434508 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.434565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.434624 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.434649 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.434666 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.537645 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.537712 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.537730 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.537754 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.537803 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.640768 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.640833 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.640856 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.640886 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.640909 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.743743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.743805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.743821 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.743843 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.743860 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.846050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.846140 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.846153 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.846169 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.846186 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.948994 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.949074 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.949098 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.949134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:22 crc kubenswrapper[5127]: I0201 06:48:22.949156 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:22Z","lastTransitionTime":"2026-02-01T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.052263 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.052327 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.052343 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.052366 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.052386 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.156433 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.156495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.156513 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.156535 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.156552 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.203914 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 13:42:15.042301947 +0000 UTC Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.259808 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.259901 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.259954 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.259978 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.259995 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.363073 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.363145 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.363168 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.363194 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.363241 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.466983 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.467044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.467061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.467088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.467105 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.571231 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.571308 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.571329 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.571353 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.571371 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.675234 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.675269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.675284 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.675317 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.675342 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.778362 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.778438 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.778459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.778487 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.778507 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.881262 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.881333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.881350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.881373 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.881389 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.983732 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.983792 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.983804 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.983823 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:23 crc kubenswrapper[5127]: I0201 06:48:23.983838 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:23Z","lastTransitionTime":"2026-02-01T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.086833 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.086897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.086915 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.086940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.086958 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.190396 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.190457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.190475 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.190500 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.190517 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.204207 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:01:09.999369052 +0000 UTC Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.234633 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.234678 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:24 crc kubenswrapper[5127]: E0201 06:48:24.234836 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.234886 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.234957 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:24 crc kubenswrapper[5127]: E0201 06:48:24.235176 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:24 crc kubenswrapper[5127]: E0201 06:48:24.235321 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:24 crc kubenswrapper[5127]: E0201 06:48:24.235432 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.293841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.293903 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.293925 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.293957 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.293981 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.397419 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.397478 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.397495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.397517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.397536 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.500199 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.500250 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.500266 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.500289 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.500307 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.603050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.603126 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.603148 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.603176 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.603200 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.706329 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.706385 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.706404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.706426 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.706442 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.809664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.809724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.809746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.809776 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.809799 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.912206 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.912272 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.912289 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.912315 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:24 crc kubenswrapper[5127]: I0201 06:48:24.912334 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:24Z","lastTransitionTime":"2026-02-01T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.015095 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.015159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.015177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.015203 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.015226 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.118378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.118447 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.118472 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.118500 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.118516 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.205073 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 08:52:12.775671144 +0000 UTC Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.221029 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.221087 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.221104 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.221128 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.221145 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.324439 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.324513 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.324534 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.324561 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.324615 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.427609 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.427674 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.427690 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.427717 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.427738 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.530808 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.530864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.530881 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.530903 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.530918 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.574558 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.588186 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.599296 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.625997 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.634224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.634283 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.634299 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.634317 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.634330 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.643317 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.657678 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.672295 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.693419 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201
06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.713944 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.732616 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.737876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.737922 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.737940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.737966 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.737983 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.747678 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.764792 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.782360 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.801460 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.826748 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.840991 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.841041 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc 
kubenswrapper[5127]: I0201 06:48:25.841056 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.841076 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.841092 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.845890 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.876767 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.896437 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:25Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.944349 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.944430 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.944448 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.944481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:25 crc kubenswrapper[5127]: I0201 06:48:25.944501 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:25Z","lastTransitionTime":"2026-02-01T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.048216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.048279 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.048295 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.048320 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.048335 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.151594 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.151648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.151658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.151680 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.151694 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.205666 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 05:10:33.661917655 +0000 UTC Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.235190 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.235273 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.235283 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.235354 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.235509 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.235640 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.235992 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.236043 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.254728 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.254790 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.254803 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.254825 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.254838 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.358847 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.358900 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.358921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.358946 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.358961 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.464883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.464957 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.464983 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.465022 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.465043 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.568158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.568218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.568231 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.568248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.568261 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.671034 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.671161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.671172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.671191 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.671201 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.774385 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.774449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.774470 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.774495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.774513 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.877707 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.877787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.877812 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.877848 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.877875 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.906543 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.906628 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.906650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.906681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.906705 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.927148 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.933071 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.933140 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.933158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.933183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.933200 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.960282 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.966884 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.966943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.966960 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.966991 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.967009 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:26 crc kubenswrapper[5127]: E0201 06:48:26.988712 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:26Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.993391 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.993500 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.993526 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.993632 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:26 crc kubenswrapper[5127]: I0201 06:48:26.993651 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:26Z","lastTransitionTime":"2026-02-01T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: E0201 06:48:27.014012 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:27Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.019551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.019644 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.019661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.019718 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.019737 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: E0201 06:48:27.045729 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:27Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:27 crc kubenswrapper[5127]: E0201 06:48:27.045918 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.047994 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.048093 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.048117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.048147 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.048165 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.151130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.151183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.151200 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.151224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.151240 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.206458 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:40:09.313075186 +0000 UTC Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.254126 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.254164 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.254173 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.254187 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.254198 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.357700 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.357772 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.357789 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.357814 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.357830 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.460722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.460790 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.460806 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.460827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.460842 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.563333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.563377 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.563387 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.563404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.563415 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.665572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.665628 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.665636 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.665650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.665658 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.768746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.768833 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.768850 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.768881 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.768904 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.871363 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.871410 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.871422 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.871436 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.871447 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.974892 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.974977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.975001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.975032 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:27 crc kubenswrapper[5127]: I0201 06:48:27.975057 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:27Z","lastTransitionTime":"2026-02-01T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.078120 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.078247 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.078261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.078280 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.078291 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.181015 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.181060 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.181074 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.181093 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.181107 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.207561 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 23:59:18.615580204 +0000 UTC Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.234973 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.235130 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.235191 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:28 crc kubenswrapper[5127]: E0201 06:48:28.235214 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:28 crc kubenswrapper[5127]: E0201 06:48:28.235491 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.235552 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:28 crc kubenswrapper[5127]: E0201 06:48:28.235845 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:28 crc kubenswrapper[5127]: E0201 06:48:28.235638 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.283809 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.283877 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.283900 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.283930 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.283955 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
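The loop above — kubelet re-recording the four node events and the "Node became not ready" condition every ~100ms — continues because nothing has yet written a CNI network config into /etc/kubernetes/cni/net.d/ (on this cluster that is ovn-kubernetes's job, and the ovnkube-node pod is shown crash-looping further down in this log). Below is a minimal Go sketch of the kind of check behind the NetworkReady=false message; it is illustrative only, not the actual kubelet/libcni code, and only the conf directory path is taken from the log:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Sketch of the CNI readiness probe implied by the log: the container
// runtime looks for at least one network config (*.conf, *.conflist,
// *.json) in its CNI conf dir and reports NetworkReady=false until then.
func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the log
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
		return
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Printf("NetworkReady=true: found %s\n", filepath.Join(confDir, e.Name()))
			return
		}
	}
	fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", confDir)
}

Until a file with one of those extensions appears, the runtime keeps answering NetworkPluginNotReady, so the kubelet re-runs the same event/condition block on every node-status sync — which is exactly the repetition visible above and below.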
Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.387455 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.387794 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.387889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.387983 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.388088 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.490917 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.490980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.490995 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.491018 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.491034 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.593303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.593409 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.593435 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.593468 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.593492 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.696638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.696705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.696728 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.696756 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.696775 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.799185 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.799236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.799249 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.799269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.799281 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.902787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.902862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.902880 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.902903 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:28 crc kubenswrapper[5127]: I0201 06:48:28.902919 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:28Z","lastTransitionTime":"2026-02-01T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.005760 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.006063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.006456 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.006787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.006857 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.109740 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.109813 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.109830 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.109855 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.109871 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.208059 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 06:28:13.837906529 +0000 UTC Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.212737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.213076 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.213263 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.213426 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.213612 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.315963 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.316028 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.316047 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.316071 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.316088 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
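The certificate_manager.go records at 06:48:28, 06:48:29, and 06:48:30 all describe the same kubelet-serving certificate (expiring 2026-02-24 05:53:03 UTC) yet report a different rotation deadline each time (2025-11-29, 2026-01-06, 2025-12-20). That is expected behavior rather than an error: client-go's certificate manager re-draws the deadline on every pass as NotBefore plus a randomized fraction of the certificate's validity (roughly 70-90%), and a deadline already in the past — as all of these are on 2026-02-01 — means "attempt rotation now". A sketch of that arithmetic follows; the 70-90% jitter is an assumption about the upstream jitter, and the NotBefore value is illustrative, since the log only shows the expiration:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

func main() {
	// NotAfter comes from the log; NotBefore is an assumed issue time.
	notBefore := time.Date(2025, time.February, 24, 5, 53, 3, 0, time.UTC)
	notAfter := time.Date(2026, time.February, 24, 5, 53, 3, 0, time.UTC)
	validity := notAfter.Sub(notBefore)
	now := time.Date(2026, time.February, 1, 6, 48, 28, 0, time.UTC) // node clock in the log
	for i := 0; i < 3; i++ {
		// The deadline is re-randomized on each sync, which is why each
		// certificate_manager line above prints a different value.
		deadline := notBefore.Add(time.Duration(float64(validity) * (0.7 + 0.2*rand.Float64())))
		fmt.Printf("rotation deadline %s (in the past: %v)\n", deadline, deadline.Before(now))
	}
}

With the whole 70-90% window falling between late 2025 and mid-January 2026 under these assumptions, every draw lands before the node's current date, so the kubelet keeps retrying rotation on each sync.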
Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.419486 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.419636 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.419656 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.419685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.419706 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.522491 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.522616 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.522642 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.522668 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.522685 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.625622 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.625695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.625715 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.625743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.625761 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.729038 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.729086 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.729100 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.729118 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.729130 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.832695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.832779 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.832796 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.832824 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.832844 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.936274 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.936329 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.936342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.936363 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:29 crc kubenswrapper[5127]: I0201 06:48:29.936377 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:29Z","lastTransitionTime":"2026-02-01T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.039977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.040055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.040073 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.040105 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.040121 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.143628 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.143703 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.143724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.143751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.143768 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.208288 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:30:35.298019819 +0000 UTC Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.234607 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.234663 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.234699 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.234801 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:30 crc kubenswrapper[5127]: E0201 06:48:30.234804 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:30 crc kubenswrapper[5127]: E0201 06:48:30.234917 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:30 crc kubenswrapper[5127]: E0201 06:48:30.235046 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:30 crc kubenswrapper[5127]: E0201 06:48:30.235101 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.254351 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.254976 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.254992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.255024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.255039 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
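The status_manager records that follow fail for a different reason than the CNI wait above: each status patch is rejected because the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, months before the node's current time of 2026-02-01 (the ovnkube-controller crash message embedded further down shows the node-side webhook call failing identically). A short Go sketch, assuming only the endpoint shown in the log, that connects and prints the presented leaf certificate's validity window; verification is skipped solely so the handshake completes and the dates can be read:

package main

import (
	"crypto/tls"
	"fmt"
)

func main() {
	// Endpoint taken from the webhook errors in the log below.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()
	leaf := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("NotBefore:", leaf.NotBefore)
	fmt.Println("NotAfter: ", leaf.NotAfter) // per the log, this passed on 2025-08-24T17:21:41Z
}

By contrast, the kubelet's own serving certificate is still valid until 2026-02-24 05:53:03 UTC per the certificate_manager lines above, so it is specifically the webhook's certificate that has lapsed.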
Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.257452 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.275226 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.292274 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.304281 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.343621 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.361804 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.361849 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.361861 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.361883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.361897 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.393767 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.412419 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.427297 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.438745 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c
9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.452316 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.464065 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.464132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.464151 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.464178 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.464195 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.466045 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.478949 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.488665 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.499906 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.516094 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.530020 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.544729 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:30Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.565802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.565846 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.565856 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.565871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.565881 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.668345 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.668443 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.668461 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.668487 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.668504 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.771959 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.772046 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.772070 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.772108 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.772131 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.875671 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.875745 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.875764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.875792 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.875810 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.978298 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.978357 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.978378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.978403 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:30 crc kubenswrapper[5127]: I0201 06:48:30.978421 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:30Z","lastTransitionTime":"2026-02-01T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.080753 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.080821 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.080839 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.080863 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.080882 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.183601 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.183638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.183650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.183668 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.183683 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.209259 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 02:19:18.549398897 +0000 UTC Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.286325 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.286364 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.286373 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.286384 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.286393 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.389777 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.389865 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.389890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.389921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.389945 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.494007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.494053 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.494068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.494092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.494109 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.597628 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.597695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.597717 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.597746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.597768 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.701792 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.701848 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.701864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.701886 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.701902 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.805053 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.805336 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.805477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.805678 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.805819 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.909679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.909742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.909764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.909802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:31 crc kubenswrapper[5127]: I0201 06:48:31.909822 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:31Z","lastTransitionTime":"2026-02-01T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.013290 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.013356 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.013373 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.013400 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.013418 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.116313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.116367 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.116384 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.116408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.116424 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.210047 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 18:17:07.215169923 +0000 UTC Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.219539 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.219629 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.219648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.219674 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.219693 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.234611 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.234662 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.234666 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:32 crc kubenswrapper[5127]: E0201 06:48:32.234755 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:32 crc kubenswrapper[5127]: E0201 06:48:32.234897 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.234996 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:32 crc kubenswrapper[5127]: E0201 06:48:32.235184 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:32 crc kubenswrapper[5127]: E0201 06:48:32.235308 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.321897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.322344 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.322476 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.322638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.322752 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.425977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.426034 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.426051 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.426074 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.426089 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.528821 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.528889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.528906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.528935 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.528955 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.632316 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.632373 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.632390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.632415 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.632432 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.735486 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.735532 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.735548 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.735571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.735635 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.837954 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.837991 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.838001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.838016 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.838028 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.940794 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.940845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.940865 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.940889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:32 crc kubenswrapper[5127]: I0201 06:48:32.940907 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:32Z","lastTransitionTime":"2026-02-01T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.045069 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.045160 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.045187 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.045220 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.045244 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.148295 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.148339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.148350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.148364 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.148374 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.210894 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 06:00:02.533919218 +0000 UTC Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.250714 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.250742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.250750 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.250762 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.250770 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.353725 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.353760 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.353771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.353788 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.353801 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.456532 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.456572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.456604 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.456620 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.456654 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.559062 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.559118 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.559135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.559159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.559177 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.662009 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.662055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.662071 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.662094 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.662110 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.764751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.764819 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.764842 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.764874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.764893 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.866944 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.867019 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.867036 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.867063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.867081 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.970494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.970625 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.970644 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.970666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:33 crc kubenswrapper[5127]: I0201 06:48:33.970683 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:33Z","lastTransitionTime":"2026-02-01T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.073477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.073541 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.073563 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.073626 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.073649 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.176906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.176979 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.177002 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.177036 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.177057 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.211652 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 05:37:18.194266824 +0000 UTC Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.240831 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:34 crc kubenswrapper[5127]: E0201 06:48:34.241051 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.241725 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:34 crc kubenswrapper[5127]: E0201 06:48:34.241884 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.241979 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:34 crc kubenswrapper[5127]: E0201 06:48:34.242125 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.242220 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:34 crc kubenswrapper[5127]: E0201 06:48:34.242337 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.280106 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.280169 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.280186 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.280209 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.280227 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.382851 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.382906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.382927 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.382952 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.382972 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.485402 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.485697 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.485784 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.485865 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.485985 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.589009 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.589071 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.589089 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.589112 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.589129 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.691365 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.691449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.691468 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.691495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.691512 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.794873 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.794951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.794969 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.794996 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.795015 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.898313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.898369 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.898387 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.898412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:34 crc kubenswrapper[5127]: I0201 06:48:34.898428 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:34Z","lastTransitionTime":"2026-02-01T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.001955 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.002035 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.002058 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.002087 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.002111 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.104525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.104607 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.104621 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.104638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.104648 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.207105 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.207148 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.207159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.207176 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.207188 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.212193 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 02:20:48.305020704 +0000 UTC Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.235250 5127 scope.go:117] "RemoveContainer" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" Feb 01 06:48:35 crc kubenswrapper[5127]: E0201 06:48:35.235431 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.310376 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.310428 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.310440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.310458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.310471 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.412599 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.412632 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.412645 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.412659 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.412669 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.515143 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.515205 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.515220 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.515240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.515257 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.617217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.617252 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.617260 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.617272 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.617281 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.720213 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.720246 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.720254 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.720269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.720278 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.823142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.823205 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.823224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.823248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.823264 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.925832 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.925894 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.925912 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.925937 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:35 crc kubenswrapper[5127]: I0201 06:48:35.925958 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:35Z","lastTransitionTime":"2026-02-01T06:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.028211 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.028239 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.028256 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.028269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.028279 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.130226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.130260 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.130271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.130287 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.130297 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.212686 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:04:12.880922242 +0000 UTC Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.232880 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.232928 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.232946 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.232971 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.232987 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.235844 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.235950 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.236146 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.236217 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.236245 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.236305 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.236332 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.236635 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.336357 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.336406 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.336414 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.336430 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.336441 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.439054 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.439100 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.439109 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.439125 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.439134 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.541603 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.541638 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.541645 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.541657 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.541666 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.644192 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.644236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.644244 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.644259 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.644268 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.746882 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.746945 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.746962 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.746992 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.747010 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.795503 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.795714 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:36 crc kubenswrapper[5127]: E0201 06:48:36.795772 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:49:08.795754629 +0000 UTC m=+99.281656992 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.849253 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.849299 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.849306 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.849323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.849332 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.952375 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.952421 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.952429 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.952447 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:36 crc kubenswrapper[5127]: I0201 06:48:36.952457 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:36Z","lastTransitionTime":"2026-02-01T06:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.054625 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.054666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.054674 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.054688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.054697 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.157659 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.157734 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.157757 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.157787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.157816 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.213672 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 04:33:01.613761901 +0000 UTC Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.260660 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.260722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.260747 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.260776 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.260797 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.332136 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.332159 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.332170 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.332181 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.332208 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.346687 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:37Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.350798 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.350840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.350854 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.350871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.350884 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.360753 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:37Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.363793 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.363894 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.363958 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.364020 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.364091 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.376149 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:37Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.379366 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.379455 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
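
[Editor's note: the patch failure above is, at bottom, a plain certificate-window check. The API server cannot POST to the node.network-node-identity.openshift.io webhook because its serving certificate expired 2025-08-24T17:21:41Z while the node clock reads 2026-02-01. The sketch below is a hypothetical diagnostic, not part of OpenShift; it assumes the webhook really listens on 127.0.0.1:9743 as the URL in the log says, dials it, and prints the NotBefore/NotAfter window that Go's x509 verifier compares against the current time.]

```go
// certwindow.go — hypothetical diagnostic sketch: inspect the validity
// window of the certificate served by the webhook endpoint from the log
// (https://127.0.0.1:9743). Not part of OpenShift or the kubelet.
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// InsecureSkipVerify lets the handshake complete even though normal
	// verification would reject the expired certificate; we only want to
	// read its dates, not trust the peer.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	now := time.Now()
	for _, cert := range conn.ConnectionState().PeerCertificates {
		fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%t\n",
			cert.Subject.String(),
			cert.NotBefore.Format(time.RFC3339),
			cert.NotAfter.Format(time.RFC3339),
			now.After(cert.NotAfter))
	}
}
```
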
event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.379522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.379599 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.379664 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.389813 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:37Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.393490 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.393548 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.393565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.393620 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.393641 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.405003 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:37Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:37 crc kubenswrapper[5127]: E0201 06:48:37.405310 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.406557 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
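
[Editor's note: every NodeNotReady heartbeat in this log carries the same root-cause string, "no CNI configuration file in /etc/kubernetes/cni/net.d/". A simplified sketch of that kind of readiness probe follows; the real logic lives in the container runtime's ocicni/libcni packages, not in this form. The network is reported unready while the conf directory contains no *.conf, *.conflist, or *.json file.]

```go
// cnicheck.go — simplified, hypothetical sketch of the readiness probe
// behind "no CNI configuration file in /etc/kubernetes/cni/net.d/".
// The actual check is performed by the runtime via ocicni/libcni.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// networkReady reports whether confDir holds at least one CNI network
// configuration file of a recognized extension.
func networkReady(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil // at least one network config is present
		}
	}
	return false, fmt.Errorf("no CNI configuration file in %s", confDir)
}

func main() {
	ready, err := networkReady("/etc/kubernetes/cni/net.d")
	fmt.Printf("NetworkReady=%t err=%v\n", ready, err)
}
```
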
event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.406669 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.406729 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.406784 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.406845 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.508976 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.509043 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.509063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.509088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.509107 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.611658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.611815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.611874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.611933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.611991 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.714009 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.714041 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.714050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.714063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.714074 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.817107 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.817130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.817138 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.817150 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.817158 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.919320 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.919376 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.919390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.919406 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:37 crc kubenswrapper[5127]: I0201 06:48:37.919428 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:37Z","lastTransitionTime":"2026-02-01T06:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.021497 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.021522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.021533 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.021546 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.021556 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.123481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.123658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.123729 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.123805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.123866 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.214212 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 03:11:06.428523549 +0000 UTC Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.225914 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.225951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.225961 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.225975 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.225988 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.235358 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.235387 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.235494 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:38 crc kubenswrapper[5127]: E0201 06:48:38.235535 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:38 crc kubenswrapper[5127]: E0201 06:48:38.235695 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:38 crc kubenswrapper[5127]: E0201 06:48:38.235787 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
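
[Editor's note: the certificate_manager line above reports a kubelet serving certificate expiring 2026-02-24 with a rotation deadline of 2025-12-16, roughly 80% of the way through a one-year lifetime. That is consistent with the client-go certificate manager picking a jittered rotation deadline between about 70% and 90% of the certificate's validity window. The sketch below illustrates that computation; the jitter bounds and the NotBefore value are assumptions, and this is not the vendored implementation.]

```go
// rotation.go — sketch of a jittered rotation deadline in the style of
// client-go's certificate manager: rotate at a random point between 70%
// and 90% of the certificate's validity window (bounds assumed here).
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a uniform point in [0.7, 0.9) of the lifetime.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jitter := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(float64(total) * jitter))
}

func main() {
	notBefore := time.Date(2025, 2, 24, 5, 53, 3, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)  // expiry from the log
	fmt.Println("rotation deadline:", rotationDeadline(notBefore, notAfter))
}
```
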
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.235713 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:38 crc kubenswrapper[5127]: E0201 06:48:38.236021 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.328970 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.329050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.329101 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.329116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.329142 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.433017 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.433183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.433196 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.433221 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.433236 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.536095 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.536124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.536133 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.536148 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.536157 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.638630 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.638677 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.638689 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.638705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.638717 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.663162 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/0.log" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.663212 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d959741-37e1-43e7-9ef6-5f33433f9447" containerID="0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436" exitCode=1 Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.663248 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerDied","Data":"0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.663775 5127 scope.go:117] "RemoveContainer" containerID="0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.677789 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.692046 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.704971 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.717036 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.731226 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.741455 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.741485 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.741493 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.741525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.741539 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.746232 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.764375 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.779435 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.792823 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.802794 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.814980 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.829783 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.842825 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.844010 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.844055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.844067 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.844084 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.844097 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.856220 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.872352 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.884050 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.907492 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:38Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.946451 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.946482 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.946492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.946505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:38 crc kubenswrapper[5127]: I0201 06:48:38.946515 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:38Z","lastTransitionTime":"2026-02-01T06:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.048664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.048699 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.048711 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.048727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.048737 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.150895 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.150925 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.150933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.150948 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.150956 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.215264 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 09:47:38.198990374 +0000 UTC Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.254308 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.254346 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.254354 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.254372 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.254382 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.357537 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.357572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.357598 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.357612 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.357621 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.460556 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.460612 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.460621 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.460635 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.460644 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.563802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.563840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.563849 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.563862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.563873 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.666332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.667499 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.667746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.667994 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.668171 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.668545 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/0.log" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.668617 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerStarted","Data":"a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.682160 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.709924 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.726470 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.742808 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.759199 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.770467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.770505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.770515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.770530 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.770547 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.781397 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.797488 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.818031 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.835348 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.852982 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.873643 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.873695 5127 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.873708 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.873727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.873741 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.874828 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.896215 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.914677 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness 
Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.927813 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.945156 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.967945 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.977478 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.977703 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.977735 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.977756 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.977769 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:39Z","lastTransitionTime":"2026-02-01T06:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:39 crc kubenswrapper[5127]: I0201 06:48:39.991790 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:39Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.080476 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.080849 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.080997 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.081131 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.081310 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.184236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.184269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.184277 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.184288 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.184297 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.217055 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 00:26:15.985466446 +0000 UTC Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.234637 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.234687 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.234864 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:40 crc kubenswrapper[5127]: E0201 06:48:40.234881 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:40 crc kubenswrapper[5127]: E0201 06:48:40.234945 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.234982 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:40 crc kubenswrapper[5127]: E0201 06:48:40.235020 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:40 crc kubenswrapper[5127]: E0201 06:48:40.235059 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.264223 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c5
77f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.281455 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.287425 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.287482 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.287505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.287535 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.287559 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.298466 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.318110 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 
2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.339369 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.355535 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.371510 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.386195 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.390083 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.390105 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.390124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.390135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.390144 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.404908 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.421764 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.439242 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness 
Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.452888 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.469453 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.486189 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.492838 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.493030 5127 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.493117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.493212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.493298 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.504894 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.525100 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.536664 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:40Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.595676 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.595730 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc 
kubenswrapper[5127]: I0201 06:48:40.595745 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.595765 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.595780 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.697561 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.697605 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.697613 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.697627 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.697636 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.800861 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.800894 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.800905 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.800919 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.800930 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.902941 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.902999 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.903016 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.903038 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:40 crc kubenswrapper[5127]: I0201 06:48:40.903055 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:40Z","lastTransitionTime":"2026-02-01T06:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.006087 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.006154 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.006171 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.006194 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.006212 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.108686 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.108724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.108733 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.108749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.108758 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.210505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.210550 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.210561 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.210594 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.210607 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.217954 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 00:31:57.579727834 +0000 UTC Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.313482 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.313522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.313539 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.313557 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.313572 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.415152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.415191 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.415201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.415218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.415229 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.517452 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.517507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.517519 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.517533 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.517545 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.619990 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.620027 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.620038 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.620050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.620059 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.721177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.721216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.721226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.721240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.721251 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.823507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.823573 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.823614 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.823641 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.823670 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.925851 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.925903 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.925910 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.925924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:41 crc kubenswrapper[5127]: I0201 06:48:41.925934 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:41Z","lastTransitionTime":"2026-02-01T06:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.028651 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.028694 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.028705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.028720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.028731 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.131127 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.131247 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.131262 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.131278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.131288 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.218526 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 01:52:41.29906686 +0000 UTC Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.233478 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.233542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.233565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.233664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.233690 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.234884 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.234910 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.234934 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.234883 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:42 crc kubenswrapper[5127]: E0201 06:48:42.234988 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:42 crc kubenswrapper[5127]: E0201 06:48:42.235086 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:42 crc kubenswrapper[5127]: E0201 06:48:42.235150 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:42 crc kubenswrapper[5127]: E0201 06:48:42.235207 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.336493 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.336547 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.336560 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.336596 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.336610 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.438955 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.439020 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.439035 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.439057 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.439071 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.541134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.541199 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.541218 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.541243 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.541263 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.643515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.643571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.643607 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.643625 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.643641 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.746185 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.746248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.746266 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.746288 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.746306 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.849174 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.849222 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.849238 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.849261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.849280 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.951818 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.951889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.951906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.951932 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:42 crc kubenswrapper[5127]: I0201 06:48:42.951950 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:42Z","lastTransitionTime":"2026-02-01T06:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.059430 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.059495 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.059514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.059540 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.059557 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.162876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.162924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.162932 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.162946 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.162957 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.219398 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:56:16.948715725 +0000 UTC Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.247301 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.265896 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.265932 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.265943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.265958 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.265969 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.368867 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.368910 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.368921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.368938 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.368950 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.471211 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.471249 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.471261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.471278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.471287 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.573738 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.573772 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.573783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.573799 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.573810 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.675699 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.675726 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.675736 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.675751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.675762 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.777191 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.777228 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.777238 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.777255 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.777267 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.879686 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.879722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.879732 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.879747 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.879757 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.982435 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.982467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.982478 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.982491 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:43 crc kubenswrapper[5127]: I0201 06:48:43.982502 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:43Z","lastTransitionTime":"2026-02-01T06:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.084250 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.084338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.084357 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.084379 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.084395 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.187611 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.187655 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.187667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.187685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.187698 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.220437 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 15:12:30.054858392 +0000 UTC Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.235476 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.235519 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.235624 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.235703 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:44 crc kubenswrapper[5127]: E0201 06:48:44.235712 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:44 crc kubenswrapper[5127]: E0201 06:48:44.235791 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:44 crc kubenswrapper[5127]: E0201 06:48:44.235978 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:44 crc kubenswrapper[5127]: E0201 06:48:44.236192 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.291668 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.291734 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.291758 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.291786 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.291810 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.395328 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.395390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.395412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.395441 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.395461 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.499267 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.499322 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.499342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.499365 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.499383 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.603779 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.603859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.603888 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.603924 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.603952 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.706954 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.706999 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.707011 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.707028 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.707040 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.810326 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.810407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.810433 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.810467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.810494 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.913556 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.913655 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.913677 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.913705 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:44 crc kubenswrapper[5127]: I0201 06:48:44.913732 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:44Z","lastTransitionTime":"2026-02-01T06:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.016764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.016836 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.016854 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.016877 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.016895 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.120137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.120186 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.120203 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.120226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.120243 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.220818 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 23:19:33.992683442 +0000 UTC Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.223442 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.223485 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.223502 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.223525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.223543 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.325787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.325841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.325858 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.325881 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.325898 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.429003 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.429335 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.429363 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.429389 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.429420 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.534412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.534507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.534525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.534549 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.534565 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.637333 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.637379 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.637389 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.637403 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.637411 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.740871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.740934 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.740955 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.740984 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.741005 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.844847 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.844906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.844923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.844945 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.844961 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.948700 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.948768 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.948793 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.948820 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:45 crc kubenswrapper[5127]: I0201 06:48:45.948842 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:45Z","lastTransitionTime":"2026-02-01T06:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.052235 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.052654 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.052825 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.052987 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.053163 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.156860 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.156925 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.156943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.156971 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.156989 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.221987 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 02:03:01.06958402 +0000 UTC Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.235393 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:46 crc kubenswrapper[5127]: E0201 06:48:46.235558 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.235955 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.235973 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.236046 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:46 crc kubenswrapper[5127]: E0201 06:48:46.236093 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:46 crc kubenswrapper[5127]: E0201 06:48:46.236231 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:46 crc kubenswrapper[5127]: E0201 06:48:46.236319 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.260001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.260048 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.260068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.260090 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.260110 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.363157 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.363226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.363245 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.363268 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.363286 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.466329 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.466388 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.466405 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.466429 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.466447 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.569063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.569132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.569149 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.569172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.569191 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.671815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.671887 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.671907 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.671933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.671954 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.776127 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.776188 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.776204 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.776227 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.776245 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.880113 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.880163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.880184 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.880211 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.880232 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.983520 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.983626 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.983653 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.983683 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:46 crc kubenswrapper[5127]: I0201 06:48:46.983704 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:46Z","lastTransitionTime":"2026-02-01T06:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.087336 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.087390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.087407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.087431 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.087453 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.191759 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.191817 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.191836 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.191860 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.191876 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.223788 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 05:19:02.129512953 +0000 UTC Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.295265 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.295341 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.295364 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.295393 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.295415 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.399056 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.399116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.399134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.399161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.399185 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.502231 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.502334 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.502390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.502416 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.502432 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.605163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.605242 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.605266 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.605294 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.605314 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.707669 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.707737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.707759 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.707788 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.707829 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.738985 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.739041 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.739062 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.739088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.739110 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.766252 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:47Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.772378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.772459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.772483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.772514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.772538 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.789869 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:47Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.795068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.795121 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.795139 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.795160 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.795174 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.814946 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:47Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.819696 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.819794 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.819822 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.819898 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.819922 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.840397 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:47Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.845257 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.845300 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.845313 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.845332 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.845428 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.863126 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:47Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:47 crc kubenswrapper[5127]: E0201 06:48:47.863500 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.865371 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.865408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.865422 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.865438 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.865450 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.970341 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.970402 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.970418 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.970441 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:47 crc kubenswrapper[5127]: I0201 06:48:47.970459 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:47Z","lastTransitionTime":"2026-02-01T06:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.074272 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.074355 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.074380 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.074406 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.074423 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.177366 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.177440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.177463 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.177491 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.177514 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.224232 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 06:55:46.199321962 +0000 UTC Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.234568 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.234706 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.234602 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:48 crc kubenswrapper[5127]: E0201 06:48:48.234759 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.234937 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:48 crc kubenswrapper[5127]: E0201 06:48:48.234927 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:48 crc kubenswrapper[5127]: E0201 06:48:48.234993 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:48 crc kubenswrapper[5127]: E0201 06:48:48.235062 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.280548 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.280575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.280602 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.280614 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.280622 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.384956 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.385023 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.385046 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.385075 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.385098 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.488418 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.488522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.488545 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.488608 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.488630 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.592813 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.592878 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.592904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.592935 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.592958 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.694997 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.695052 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.695069 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.695091 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.695109 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.798722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.798787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.798807 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.798831 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.798848 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.901867 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.901916 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.901927 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.901945 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:48 crc kubenswrapper[5127]: I0201 06:48:48.901957 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:48Z","lastTransitionTime":"2026-02-01T06:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.005112 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.005171 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.005189 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.005211 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.005228 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.107397 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.107458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.107479 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.107506 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.107532 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.210541 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.210656 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.210675 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.210703 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.210722 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.224941 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:53:27.136871266 +0000 UTC Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.313136 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.313177 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.313185 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.313202 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.313211 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.415787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.415820 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.415827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.415840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.415849 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.518877 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.518944 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.518963 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.518987 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.519011 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.621876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.621938 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.621957 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.621982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.622000 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.725350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.725417 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.725438 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.725461 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.725479 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.829172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.829236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.829255 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.829278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.829296 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.933562 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.933664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.933677 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.933702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:49 crc kubenswrapper[5127]: I0201 06:48:49.933721 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:49Z","lastTransitionTime":"2026-02-01T06:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.037892 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.037947 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.037959 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.037980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.037995 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.140807 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.140890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.140914 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.140939 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.140959 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.226033 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 20:26:51.814289679 +0000 UTC Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.235535 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.235626 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:50 crc kubenswrapper[5127]: E0201 06:48:50.235796 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.235919 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.236013 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:50 crc kubenswrapper[5127]: E0201 06:48:50.236214 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:50 crc kubenswrapper[5127]: E0201 06:48:50.236352 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:50 crc kubenswrapper[5127]: E0201 06:48:50.236504 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.237722 5127 scope.go:117] "RemoveContainer" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.248909 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.248957 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.248969 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.248987 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.248999 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.251845 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.270333 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.289629 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.304944 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.336532 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.351967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.352018 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.352052 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.352073 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.352088 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.355216 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.373656 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.392434 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 
2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.413044 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.431813 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.447754 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.454805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.454833 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.454845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.454871 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.454882 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.463803 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.485925 5127 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d397
3253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.500315 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.519821 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness 
Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.533788 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.552087 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.557079 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.557123 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.557134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.557151 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.557162 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.564262 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.659477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.659533 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.659549 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.659572 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.659616 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.703662 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/2.log" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.707225 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.707890 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.727731 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e
242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods 
\\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.749667 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.762734 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.762800 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.762812 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.762850 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.762864 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.769695 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.780780 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.791755 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.803335 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.815448 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.831222 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.851683 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.865470 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.865692 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc 
kubenswrapper[5127]: I0201 06:48:50.865798 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.865894 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.865971 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.866063 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.891178 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object 
s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.911613 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.926316 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.944231 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 
2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.958801 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.968751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.968956 
5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.969051 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.969142 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.969270 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:50Z","lastTransitionTime":"2026-02-01T06:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.974708 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:50 crc kubenswrapper[5127]: I0201 06:48:50.993422 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:50Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.008694 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.072203 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.072257 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.072268 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.072291 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.072306 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.175826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.175879 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.175897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.175920 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.175936 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.227101 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:33:24.834247921 +0000 UTC Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.279068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.279303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.279314 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.279331 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.279343 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.383117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.383171 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.383187 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.383210 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.383229 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.486505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.486858 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.487040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.487171 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.487293 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.590208 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.590571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.590808 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.591013 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.591189 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.693765 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.693831 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.693851 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.693878 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.693898 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.713046 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/3.log" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.714020 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/2.log" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.717664 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" exitCode=1 Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.717727 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.717780 5127 scope.go:117] "RemoveContainer" containerID="b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.722038 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 06:48:51 crc kubenswrapper[5127]: E0201 06:48:51.722247 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.733718 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.756725 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.777924 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.798067 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.798117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc 
kubenswrapper[5127]: I0201 06:48:51.798130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.798149 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.798160 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.802389 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4
ff970372ffa4d127423da589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f875628f371f699b01476b929793695b9d99c577f011004821ab60afd2b2d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:19Z\\\",\\\"message\\\":\\\"ables-alerter-4ln5h after 0 failed attempt(s)\\\\nI0201 06:48:19.376103 6875 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-cmdjj\\\\nI0201 06:48:19.376108 6875 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI0201 06:48:19.376106 6875 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nF0201 06:48:19.376073 6875 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:19Z is after 2025-08-24T17:21:41Z]\\\\nI0201 06:48:19.376116 6875 obj_retry.go:303] Retry object s\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:51Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:48:51.236836 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:48:51.236903 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:48:51.236959 7270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 06:48:51.237065 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:51.237084 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:51.237147 7270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:48:51.237178 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 06:48:51.237217 7270 factory.go:656] Stopping watch factory\\\\nI0201 06:48:51.237241 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:48:51.237276 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:48:51.237299 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:48:51.237313 7270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:48:51.237328 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 
06:48:51.237342 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:48:51.237355 7270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 06:48:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.818323 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.831889 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.852794 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 
2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.871863 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.891627 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.901478 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.901543 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.901560 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.901618 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.901640 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:51Z","lastTransitionTime":"2026-02-01T06:48:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.912303 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.929093 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.950165 5127 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67
f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.967901 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:51 crc kubenswrapper[5127]: I0201 06:48:51.989804 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness 
Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:51Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.004207 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.004281 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.004336 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.004368 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.004391 5127 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.006056 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.021986 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.039110 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.054701 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.107098 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.107162 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.107185 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.107213 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.107234 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.210493 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.210574 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.210634 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.210670 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.210692 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.228939 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 14:07:37.71855372 +0000 UTC Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.235296 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.235329 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.235403 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:52 crc kubenswrapper[5127]: E0201 06:48:52.235487 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.235510 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:52 crc kubenswrapper[5127]: E0201 06:48:52.235678 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:52 crc kubenswrapper[5127]: E0201 06:48:52.235867 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:52 crc kubenswrapper[5127]: E0201 06:48:52.236010 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.314265 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.314326 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.314344 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.314468 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.314490 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.418195 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.418275 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.418290 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.418317 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.418333 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.520790 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.520849 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.520867 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.520890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.520908 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.623493 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.623563 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.623648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.623683 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.623709 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.723997 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/3.log" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.725854 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.725910 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.725927 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.725950 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.725968 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.731222 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 06:48:52 crc kubenswrapper[5127]: E0201 06:48:52.731802 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.752514 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.773525 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.799313 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.817644 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 
01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.829691 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.829787 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.829811 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.829851 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.829873 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.852842 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4
ff970372ffa4d127423da589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:51Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:48:51.236836 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:48:51.236903 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:48:51.236959 7270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 06:48:51.237065 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:51.237084 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:51.237147 7270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:48:51.237178 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 06:48:51.237217 7270 factory.go:656] Stopping watch factory\\\\nI0201 06:48:51.237241 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:48:51.237276 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:48:51.237299 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:48:51.237313 7270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:48:51.237328 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:51.237342 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:48:51.237355 7270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 06:48:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.873419 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.897394 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.918715 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.934303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.934356 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.934376 5127 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.934403 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.934420 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:52Z","lastTransitionTime":"2026-02-01T06:48:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.939800 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.960655 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:52 crc kubenswrapper[5127]: I0201 06:48:52.987948 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:52Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.005915 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.028191 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 
06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.037609 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.037648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.037660 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.037680 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.037694 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.050873 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.072346 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.089392 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.108504 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 
06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.126941 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:53Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.140806 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.140858 5127 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.140876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.140904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.140922 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.229755 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:35:51.02988077 +0000 UTC Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.243760 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.243811 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.243834 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.243860 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.243882 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.346881 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.347210 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.347304 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.347411 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.347499 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.450768 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.451181 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.451322 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.451465 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.451682 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.555883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.555976 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.555995 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.556020 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.556037 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.659273 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.659338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.659359 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.659389 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.659408 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.762183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.762269 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.762287 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.762326 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.762347 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.864923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.864982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.865000 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.865024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.865044 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.985374 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.985453 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.985474 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.985508 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:53 crc kubenswrapper[5127]: I0201 06:48:53.985529 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:53Z","lastTransitionTime":"2026-02-01T06:48:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
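The NodeNotReady heartbeats repeating through this stretch all carry the same root cause: the container runtime reports NetworkReady=false because no CNI config exists in /etc/kubernetes/cni/net.d/ yet. On this cluster that file is written by OVN-Kubernetes, and the earlier multus failure shows it timed out waiting for 10-ovn-kubernetes.conf, so the two errors are the same condition seen from two components. Until a config appears, the kubelet will not start any pod that needs the pod network. The runtime's check amounts to looking for a parseable config file in that directory; a minimal sketch follows, with the extension list being an assumption rather than something this log states.

    import os

    CNI_DIR = "/etc/kubernetes/cni/net.d"  # path taken from the kubelet message

    try:
        confs = sorted(f for f in os.listdir(CNI_DIR)
                       if f.endswith((".conf", ".conflist", ".json")))
    except FileNotFoundError:
        confs = []

    # An empty list here is exactly the "no CNI configuration file" state
    # that keeps the node's Ready condition False.
    print("CNI configs:", confs if confs else "none; node stays NotReady")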
Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.088161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.088643 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.088667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.088709 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.088731 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.090825 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.091002 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.091048 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091077 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.091044543 +0000 UTC m=+148.576946946 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.091141 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091156 5127 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091227 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.091205028 +0000 UTC m=+148.577107401 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091243 5127 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.091289 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091351 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.091323491 +0000 UTC m=+148.577225894 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091483 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091501 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091523 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091532 5127 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091544 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091550 5127 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091644 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.091628909 +0000 UTC m=+148.577531302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.091673 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.09166073 +0000 UTC m=+148.577563123 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.191863 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.191929 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.191946 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.191974 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.191991 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.229959 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 11:02:43.496924268 +0000 UTC Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.235662 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.235822 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.235993 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.236146 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.236429 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.236526 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.237048 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:48:54 crc kubenswrapper[5127]: E0201 06:48:54.237209 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.259209 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.295434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.296172 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.296285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.296379 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.296462 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.399854 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.399911 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.399930 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.399955 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.399972 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.503374 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.503453 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.503471 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.503499 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.503516 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.607453 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.607826 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.607967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.608111 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.608234 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.712795 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.713245 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.713470 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.713770 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.713970 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.816801 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.816859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.816874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.816897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.816913 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.920076 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.920153 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.920170 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.920195 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:54 crc kubenswrapper[5127]: I0201 06:48:54.920213 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:54Z","lastTransitionTime":"2026-02-01T06:48:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.023348 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.023424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.023448 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.023483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.023507 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.130746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.130828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.130855 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.130889 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.130923 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.230495 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 21:57:37.877437776 +0000 UTC Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.234124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.234194 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.234217 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.234248 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.234270 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.337627 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.337684 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.337699 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.337722 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.337739 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.441412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.441525 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.441542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.441571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.441660 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.544774 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.545001 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.545021 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.545045 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.545062 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.650244 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.650302 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.650314 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.650338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.650351 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.753451 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.753522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.753530 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.753544 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.753553 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.856951 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.856990 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.857004 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.857025 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.857038 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.960061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.960099 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.960112 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.960126 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:55 crc kubenswrapper[5127]: I0201 06:48:55.960137 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:55Z","lastTransitionTime":"2026-02-01T06:48:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.063669 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.063745 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.063769 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.063797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.063815 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.167212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.167280 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.167299 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.167323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.167341 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.231459 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:25:34.169664272 +0000 UTC
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.234831 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.234905 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.235005 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:48:56 crc kubenswrapper[5127]: E0201 06:48:56.234997 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:48:56 crc kubenswrapper[5127]: E0201 06:48:56.235164 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:48:56 crc kubenswrapper[5127]: E0201 06:48:56.235258 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.235401 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:48:56 crc kubenswrapper[5127]: E0201 06:48:56.235506 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.270458 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.270512 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.270529 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.270554 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.270573 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.372396 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.372444 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.372459 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.372477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.372494 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.475679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.475732 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.475755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.475783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.475807 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.579439 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.579501 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.579519 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.579542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.579558 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.681904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.681963 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.681982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.682004 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.682019 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.785568 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.785682 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.785704 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.785727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.785745 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.888846 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.888897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.888909 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.888928 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.888943 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.992258 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.992342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.992385 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.992417 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:56 crc kubenswrapper[5127]: I0201 06:48:56.992440 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:56Z","lastTransitionTime":"2026-02-01T06:48:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.095134 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.095195 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.095212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.095240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.095259 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.199542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.199864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.199995 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.200033 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.200091 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.232165 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:59:18.864377369 +0000 UTC Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.303314 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.303376 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.303393 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.303418 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.303437 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.407390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.407438 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.407447 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.407462 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.407474 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.510481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.510555 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.510574 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.510681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.510712 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.614950 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.615006 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.615020 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.615040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.615054 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.718360 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.718436 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.718446 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.718461 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.718472 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.822416 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.822613 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.822637 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.822667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.822687 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.925749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.925817 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.925839 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.925868 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:57 crc kubenswrapper[5127]: I0201 06:48:57.925887 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:57Z","lastTransitionTime":"2026-02-01T06:48:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.030128 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.030201 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.030223 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.030256 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.030283 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.133312 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.133392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.133415 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.133447 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.133473 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.232776 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 03:13:30.977018762 +0000 UTC
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.235265 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.235378 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.235514 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.235674 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.235847 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.235961 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.236109 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.236357 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.237943 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.238039 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.238068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.238102 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.238128 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.260090 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.260127 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.260140 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.260155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.260167 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.276224 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.280432 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.280499 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.280516 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.280542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.280611 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.300267 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.305502 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.305664 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.305688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.305714 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.305736 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.325635 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.330982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.331045 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.331064 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.331092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.331115 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.350410 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.355062 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.355117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.355137 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.355163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.355181 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.375087 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:48:58Z is after 2025-08-24T17:21:41Z" Feb 01 06:48:58 crc kubenswrapper[5127]: E0201 06:48:58.375335 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.378096 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.378179 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.378206 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.378241 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.378276 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.482342 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.482395 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.482412 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.482440 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.482458 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.585120 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.585171 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.585187 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.585208 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.585219 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.688293 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.688369 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.688393 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.688424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.688447 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.791967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.792007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.792019 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.792036 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.792050 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.894528 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.894570 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.894604 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.894622 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.894635 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.997836 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.997904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.997923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.997952 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:58 crc kubenswrapper[5127]: I0201 06:48:58.997969 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:58Z","lastTransitionTime":"2026-02-01T06:48:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.101481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.101557 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.101575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.101627 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.101647 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.204082 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.204120 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.204130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.204148 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.204165 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.233917 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 13:54:17.244533874 +0000 UTC
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.312335 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.312406 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.312425 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.312450 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.312468 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.416152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.416216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.416233 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.416258 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.416276 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.518865 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.518897 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.518907 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.518922 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.518933 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.622755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.622823 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.622847 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.622879 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.622905 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.726147 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.726236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.726252 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.726278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.726299 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.830424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.830492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.830510 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.830537 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.830558 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.933825 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.933892 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.933907 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.933933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:48:59 crc kubenswrapper[5127]: I0201 06:48:59.933949 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:48:59Z","lastTransitionTime":"2026-02-01T06:48:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.036653 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.036719 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.036737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.036763 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.036782 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.139972 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.140030 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.140049 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.140069 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.140084 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.234455 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 00:51:07.306826273 +0000 UTC
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.234718 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.234767 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.234825 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:49:00 crc kubenswrapper[5127]: E0201 06:49:00.234918 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.234936 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:49:00 crc kubenswrapper[5127]: E0201 06:49:00.235116 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:49:00 crc kubenswrapper[5127]: E0201 06:49:00.235228 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:49:00 crc kubenswrapper[5127]: E0201 06:49:00.235440 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.241391 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.241434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.241446 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.241460 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.241473 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.260430 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.278481 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.291634 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.308039 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 
06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.320471 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.340163 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.343522 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.343563 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.343603 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.343623 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.343637 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.355571 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.373187 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.389475 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.407545 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.419372 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447533 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447602 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447618 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447640 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447658 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.447367 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4
ff970372ffa4d127423da589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:51Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:48:51.236836 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:48:51.236903 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:48:51.236959 7270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 06:48:51.237065 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:51.237084 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:51.237147 7270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:48:51.237178 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 06:48:51.237217 7270 factory.go:656] Stopping watch factory\\\\nI0201 06:48:51.237241 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:48:51.237276 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:48:51.237299 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:48:51.237313 7270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:48:51.237328 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:51.237342 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:48:51.237355 7270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 06:48:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.467136 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.487946 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 
2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.509217 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.530524 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.547803 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.551040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.551114 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.551178 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.551206 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.551257 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.567237 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bb61d7-313c-4fc7-b801-632e54ca1a7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e9c0789ea3fba09e6acc8e9548395f6ad0333e9ac67f893c6338090943511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8908ea21cbb09bc49a45bb88c61d337966b1006166e92da4ba4cc50ecfe47568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b1ad49dcaaabd979a0f55208958b957478c7b93ef17cfd770fb6166c65e3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226c1b4a1565d1d3068623995df09dad0197b18855a84fb07f3fa49b393441db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://740ca69e35708127ed32e647e8cd9a203b0696d3508b0fb8876e544983ced563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.580174 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:00Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.654113 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.654180 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.654204 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.654237 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.654262 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.756968 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.757042 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.757061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.757088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.757105 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.860650 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.860720 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.860743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.860773 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.860795 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.964405 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.964474 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.964494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.964518 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:00 crc kubenswrapper[5127]: I0201 06:49:00.964535 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:00Z","lastTransitionTime":"2026-02-01T06:49:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.067901 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.067942 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.067952 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.067971 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.067984 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.171179 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.171232 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.171245 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.171270 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.171286 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.235162 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:39:44.607204734 +0000 UTC Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.274338 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.274370 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.274378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.274393 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.274401 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.377164 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.377235 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.377253 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.377278 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.377299 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.481707 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.482387 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.482427 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.482457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.482478 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.585925 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.585990 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.586007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.586045 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.586063 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.688988 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.689054 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.689072 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.689097 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.689114 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.791966 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.792048 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.792067 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.792092 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.792110 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.896448 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.896519 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.896542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.896575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.896660 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.999670 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.999735 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.999751 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.999775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:01 crc kubenswrapper[5127]: I0201 06:49:01.999792 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:01Z","lastTransitionTime":"2026-02-01T06:49:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.103029 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.103066 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.103075 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.103088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.103101 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.206212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.206260 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.206277 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.206300 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.206316 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.235135 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.235251 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.235153 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.235321 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 23:37:27.548078078 +0000 UTC Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.235355 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:02 crc kubenswrapper[5127]: E0201 06:49:02.235349 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:02 crc kubenswrapper[5127]: E0201 06:49:02.235438 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:02 crc kubenswrapper[5127]: E0201 06:49:02.235514 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:02 crc kubenswrapper[5127]: E0201 06:49:02.235683 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.308967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.309024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.309046 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.309074 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.309096 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.411683 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.411761 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.411778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.411803 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.411821 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.515274 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.515349 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.515367 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.515390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.515407 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.617728 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.617795 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.617816 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.617846 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.617869 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.720700 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.720759 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.720778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.720801 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.720818 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.824531 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.824615 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.824632 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.824655 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.824672 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.926784 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.926816 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.926827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.926841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:02 crc kubenswrapper[5127]: I0201 06:49:02.926852 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:02Z","lastTransitionTime":"2026-02-01T06:49:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.030994 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.031040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.031102 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.031122 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.031133 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.133726 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.133767 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.133775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.133789 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.133799 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.235393 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 07:36:58.934864059 +0000 UTC Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.236681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.236746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.236767 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.236791 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.236810 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.339888 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.340013 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.340094 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.340124 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.340146 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.443733 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.443778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.443789 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.443804 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.443814 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.547140 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.547212 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.547235 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.547268 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.547292 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.649786 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.649857 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.649880 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.649908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.649928 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.752967 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.753028 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.753044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.753068 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.753087 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.856109 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.856178 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.856196 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.856219 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.856239 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.959739 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.959815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.959835 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.959860 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:03 crc kubenswrapper[5127]: I0201 06:49:03.959881 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:03Z","lastTransitionTime":"2026-02-01T06:49:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.063538 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.063627 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.063652 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.063679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.063704 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.166916 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.166973 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.166995 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.167026 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.167051 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.234918 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.234973 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.235091 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:04 crc kubenswrapper[5127]: E0201 06:49:04.235083 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.235199 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:04 crc kubenswrapper[5127]: E0201 06:49:04.235338 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:04 crc kubenswrapper[5127]: E0201 06:49:04.235723 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.235833 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:54:06.898069369 +0000 UTC Feb 01 06:49:04 crc kubenswrapper[5127]: E0201 06:49:04.235988 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.270862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.270909 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.270921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.270940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.270952 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.374407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.374465 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.374481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.374505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.374523 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.477570 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.477695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.477712 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.477742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.477766 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.580888 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.580956 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.580978 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.581008 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.581029 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.683996 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.684098 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.684121 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.684150 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.684176 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.786908 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.786980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.787002 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.787031 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.787053 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.890552 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.890709 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.890726 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.890749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.890768 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.993523 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.993656 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.993675 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.993697 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:04 crc kubenswrapper[5127]: I0201 06:49:04.993714 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:04Z","lastTransitionTime":"2026-02-01T06:49:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.097184 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.097266 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.097282 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.097304 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.097321 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.200738 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.200797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.200817 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.200840 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.200857 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.236220 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 06:49:05 crc kubenswrapper[5127]: E0201 06:49:05.236517 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.236637 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:02:00.710479778 +0000 UTC Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.303963 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.304138 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.304156 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.304183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.304217 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.407980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.408044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.408060 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.408086 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.408103 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.510910 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.510969 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.510986 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.511009 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.511026 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.614405 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.614515 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.614534 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.614631 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.614739 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.717911 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.717956 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.717978 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.718005 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.718024 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.821271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.821318 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.821337 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.821357 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.821374 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.924727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.924782 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.924798 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.924820 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:05 crc kubenswrapper[5127]: I0201 06:49:05.924838 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:05Z","lastTransitionTime":"2026-02-01T06:49:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.027415 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.027464 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.027481 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.027505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.027523 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.131742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.131802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.131819 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.131856 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.131891 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.234563 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:06 crc kubenswrapper[5127]: E0201 06:49:06.234806 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235192 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:06 crc kubenswrapper[5127]: E0201 06:49:06.235350 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235833 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235935 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.235958 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.237200 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:06 crc kubenswrapper[5127]: E0201 06:49:06.237345 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.237637 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:31:39.605632247 +0000 UTC Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.237947 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:06 crc kubenswrapper[5127]: E0201 06:49:06.238097 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.338774 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.338835 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.338856 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.338885 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.338906 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.441980 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.442061 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.442085 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.442117 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.442134 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.545688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.545753 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.545771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.545797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.545816 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.649055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.649110 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.649130 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.649158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.649180 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.751817 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.751887 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.751911 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.751940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.751962 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.854775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.854830 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.854847 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.854869 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.854886 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.958240 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.958303 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.958323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.958346 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:06 crc kubenswrapper[5127]: I0201 06:49:06.958363 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:06Z","lastTransitionTime":"2026-02-01T06:49:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.061050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.061121 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.061139 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.061166 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.061184 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.164474 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.164526 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.164542 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.164562 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.164603 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.238747 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:53:30.085772478 +0000 UTC Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.267676 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.267735 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.267756 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.267786 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.267807 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.371296 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.371399 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.371423 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.371451 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.371471 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.473320 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.473370 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.473391 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.473416 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.473437 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.576408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.576477 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.576494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.576518 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.576535 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.679606 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.679661 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.679678 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.679698 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.679714 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.782678 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.782750 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.782769 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.782792 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.782809 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.886051 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.886112 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.886135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.886163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.886185 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.989454 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.989510 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.989526 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.989549 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:07 crc kubenswrapper[5127]: I0201 06:49:07.989565 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:07Z","lastTransitionTime":"2026-02-01T06:49:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.092173 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.092281 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.092300 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.092323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.092343 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.195815 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.195862 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.195881 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.195906 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.195923 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.234775 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.234989 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.235317 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.235466 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.235796 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.235819 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.235960 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.236056 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.238846 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:14:12.568261278 +0000 UTC Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.299144 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.299196 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.299216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.299239 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.299258 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.460749 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.460794 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.460805 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.460822 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.460833 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.564652 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.564724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.564744 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.564769 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.564789 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.619052 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.619122 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.619141 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.619169 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.619188 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.635085 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.639674 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.639733 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.639752 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.639778 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.639795 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.660707 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.666054 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.666143 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.666161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.666181 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.666196 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.685225 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.689686 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.689753 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.689781 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.689814 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.689840 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.707966 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.713307 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.713347 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.713360 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.713404 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.713416 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.731861 5127 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T06:49:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"98b48bbb-d7fc-478c-b553-b66324236dfc\\\",\\\"systemUUID\\\":\\\"ebe07c8f-9946-4616-a1da-f5bf2315344d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:08Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.732083 5127 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.734156 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.734203 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.734216 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.734238 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.734253 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.836610 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.836671 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.836684 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.836698 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.836709 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.877517 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.877771 5127 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:49:08 crc kubenswrapper[5127]: E0201 06:49:08.877876 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs podName:bafc814f-6c41-40cf-b3f4-8babc6ec840a nodeName:}" failed. No retries permitted until 2026-02-01 06:50:12.877849577 +0000 UTC m=+163.363751970 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs") pod "network-metrics-daemon-ls5xc" (UID: "bafc814f-6c41-40cf-b3f4-8babc6ec840a") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.939921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.939986 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.940004 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.940030 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:08 crc kubenswrapper[5127]: I0201 06:49:08.940046 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:08Z","lastTransitionTime":"2026-02-01T06:49:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.043679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.043754 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.043775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.043802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.043819 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.146984 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.147023 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.147034 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.147050 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.147062 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.239051 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 01:33:09.164341047 +0000 UTC Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.250511 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.250571 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.250636 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.250667 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.250689 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.353983 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.354038 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.354056 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.354079 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.354096 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.456690 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.456742 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.456760 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.456783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.456800 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.560505 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.560637 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.560662 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.560688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.560706 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.663474 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.663553 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.663575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.663644 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.663669 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.766773 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.766843 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.766859 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.766884 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.766901 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.870378 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.870429 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.870457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.870499 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.870521 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.974056 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.974113 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.974132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.974155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:09 crc kubenswrapper[5127]: I0201 06:49:09.974171 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:09Z","lastTransitionTime":"2026-02-01T06:49:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.077485 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.077551 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.077575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.077640 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.077664 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.180453 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.180504 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.180523 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.180546 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.180564 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.234613 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.234686 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.234616 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:10 crc kubenswrapper[5127]: E0201 06:49:10.235964 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.236047 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:10 crc kubenswrapper[5127]: E0201 06:49:10.236231 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:10 crc kubenswrapper[5127]: E0201 06:49:10.236649 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:10 crc kubenswrapper[5127]: E0201 06:49:10.237536 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.239396 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:12:44.804845454 +0000 UTC Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.253257 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31c5a748-52d4-43a5-8425-140c08dad789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e70fe3ab1894d1c8fa7c60af268a4177bd430373470d1eb0c6c7d85756aa39a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://afa4d0a85aa166d038df1156d87c26c66d0c8bef2cdc77db59a52b5adcc0d179\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.272898 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de21fcd8-f19d-4564-984f-22a1ab77dd82\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T06:47:49Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 06:47:43.989168 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 06:47:43.991525 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1772085155/tls.crt::/tmp/serving-cert-1772085155/tls.key\\\\\\\"\\\\nI0201 06:47:49.770494 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 06:47:49.775166 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 06:47:49.775201 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 06:47:49.775241 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 06:47:49.775252 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 06:47:49.784675 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 06:47:49.784715 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0201 06:47:49.784716 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0201 06:47:49.784724 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 06:47:49.784746 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 06:47:49.784754 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 06:47:49.784763 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 06:47:49.784769 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0201 06:47:49.787350 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.283112 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.283161 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.283178 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.283199 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.283217 5127 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.293970 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ae9039d-790d-451e-893e-8da67d4e8057\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cffb221a6298dd5b3a0c4b96282fc5a196efa0f0a6c2d4eb2dd95a22014190af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e407f58fba9b8f9febff7651e4a154e669297d337cf105ea06a6eb18945e20b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d3b423f6d9db7e9ebe15ad3f909ad74a42f1bfd27b7870a24893b6b183cbca3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.315634 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cmdjj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4d959741-37e1-43e7-9ef6-5f33433f9447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:37Z\\\",\\\"message\\\":\\\"2026-02-01T06:47:52+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41\\\\n2026-02-01T06:47:52+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b6676a-04c5-495d-841e-7730e22f2b41 to /host/opt/cni/bin/\\\\n2026-02-01T06:47:52Z [verbose] multus-daemon started\\\\n2026-02-01T06:47:52Z [verbose] Readiness Indicator file check\\\\n2026-02-01T06:48:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klzfm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cmdjj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.329881 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kqhgg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"184f0be4-bae6-4988-8d01-862fa5745a14\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0aedb440da89758d0f4d22e325245e80c38f8e97800393caea168cced0e24cca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t6dpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:52Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kqhgg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.346049 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a16bde8c-7758-4d94-a246-bbafcff4d733\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24ecbcffc9e4d631e0d5067079752ab37eba8d4ec4ab9cff4c83af23bc37c955\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a85868a92544dafae3bb2addbca1b603343b32bfbc4416045b788e30a8f5a4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:48:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7j5pd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zjfhr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 
06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.361521 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bafc814f-6c41-40cf-b3f4-8babc6ec840a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vd8tm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:48:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ls5xc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.380281 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7addbaeecda60d047fe51fa2b83b4eabee209cf5cb66d4534aac27521af4c46c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.385472 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.385497 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.385508 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.385524 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.385535 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.395986 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.414496 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31eb743e-decb-4243-ae21-91cc7b399ce1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffe7a495284d2f6cfb72338597199dd78a58c8877979c1d2d0697369d764754\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ba2132c4edca26b373f8f19d4fdeca68daa989ee05ce1fdf7bef55e2d433c02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d7c89f02fcf7e05a53c2f2e5544a55a9177af823626b9189085f4a2b78b9768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50aae0abd97de30ab90881857835ee574b188a83ba43a45528eceadae6a222a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://01570c1cee83388f7f3898ceec29675cfe4f14fc825a02f2c5e4396627d268ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1f6eae3a2a6287d4e0ccbfdcaef50793d6d298102281832a41eb9731db251e0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb8db2ee5f0c8d8996849ba42cef00398581dcf8f64832134267dd048e0c8874\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-krjrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6d7gz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.429132 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpm98" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da77835b-2181-45cd-837e-b633fd15a3c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a3e78154d4b9584a4d20d11fb8d2c26880ac03baef503c9df220bd03caf39ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7jv5l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpm98\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.447809 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5034ec6a-7968-4592-a09b-a57a56ebdbc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T06:48:51Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 06:48:51.236836 7270 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 06:48:51.236903 7270 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 06:48:51.236959 7270 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 06:48:51.237065 7270 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 06:48:51.237084 7270 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 06:48:51.237147 7270 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 06:48:51.237178 7270 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 06:48:51.237217 7270 factory.go:656] Stopping watch factory\\\\nI0201 06:48:51.237241 7270 ovnkube.go:599] Stopped ovnkube\\\\nI0201 06:48:51.237276 7270 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 06:48:51.237299 7270 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 06:48:51.237313 7270 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 06:48:51.237328 7270 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 06:48:51.237342 7270 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 06:48:51.237355 7270 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 06:48:5\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T06:48:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ptwjj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-njlcv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.467614 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48bb61d7-313c-4fc7-b801-632e54ca1a7a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b6e9c0789ea3fba09e6acc8e9548395f6ad0333e9ac67f893c6338090943511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8908ea21cbb09bc49a45bb88
c61d337966b1006166e92da4ba4cc50ecfe47568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://26b1ad49dcaaabd979a0f55208958b957478c7b93ef17cfd770fb6166c65e3e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://226c1b4a1565d1d3068623995df09dad0197b18855a84fb07f3fa49b393441db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://740ca69e35708127ed32e647e8cd9a203b0696d3508b0fb8876e544983ced563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d783b81865d0ee1c58cd84cfc82b4e54f5eeb4c6390299590da86b1a1c5f7b21\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10b0811bc4ca1fcf16fe57a6692ff3c9b62fb94e4e464fb18868389cb55749f7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1fe9b291c33b77d23d194243ad05329ad869ee3b459f2291503d40f3079515f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.480616 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"657a5f85-b23f-4e68-be14-2fd264da2784\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23f7ac4d7b2e2ea6b75fcf7f49014494abbd25aea81eaa50b7f85e3f7dd0eb00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0e32a799e47f5f779bdd9d3cd02d6a78730f1ec8133aeccfa2d28bf76c9d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5399dfb0b8839fa955a8b0a5c5e06dbdbbaa6aa4af31705356ea2b22207c2f84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e6a59f55754f58f9ef0211776a9c63497f286a05078069f112674c3751c0c98\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T06:47:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T06:47:31Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:30Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.487904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.487964 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.487974 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.488000 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.488013 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.498434 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6194088582dcfb759a95d03f4a70b5884362f4283b914dda68975dca2d90defd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.511735 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fbbf4cc2e2e5add30b8d7da3815a90ba2c0a42b303b266a009b188d53119f67f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5eb0acf350e867b48e7c45c97bb6a6861d9fe32b74a211b38f5a49e374719fd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.524370 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.558844 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z" Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.581602 5127 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"874ffcf5-fe2e-4225-a2a1-38f900cbffaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T06:47:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18ca0758a0f9bb7ec1f5f897e30d3804a0867aeb5892ad453fb870a430129028\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T06:47:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7t7p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T06:47:50Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-s2frk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T06:49:10Z is after 2025-08-24T17:21:41Z"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.590486 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.590666 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.590737 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.590816 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.590876 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.699841 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.699904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.699923 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.699950 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.699969 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.803072 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.803143 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.803167 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.803199 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.803222 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.906383 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.906448 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.906467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.906493 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:10 crc kubenswrapper[5127]: I0201 06:49:10.906513 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:10Z","lastTransitionTime":"2026-02-01T06:49:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.009707 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.009755 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.009775 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.009802 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.009822 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.112938 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.113003 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.113025 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.113055 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.113383 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.216565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.216647 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.216670 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.216703 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.216723 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.240256 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 16:29:10.505914142 +0000 UTC
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.320509 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.320567 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.320621 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.320649 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.320672 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.424102 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.424158 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.424175 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.424198 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.424214 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.527658 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.527725 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.527743 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.527768 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.527788 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.630942 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.631005 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.631025 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.631051 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.631075 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.734395 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.734457 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.734513 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.734544 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.734567 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.836845 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.836904 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.836921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.836944 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.836962 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.940494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.940565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.940623 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.940648 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:11 crc kubenswrapper[5127]: I0201 06:49:11.940669 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:11Z","lastTransitionTime":"2026-02-01T06:49:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.044341 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.044403 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.044424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.044456 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.044476 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.147350 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.147414 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.147432 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.147483 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.147500 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.234559 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.234633 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.234645 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.234622 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:49:12 crc kubenswrapper[5127]: E0201 06:49:12.234819 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:49:12 crc kubenswrapper[5127]: E0201 06:49:12.234962 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:49:12 crc kubenswrapper[5127]: E0201 06:49:12.235106 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a"
Feb 01 06:49:12 crc kubenswrapper[5127]: E0201 06:49:12.235206 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.240410 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 02:42:12.257916644 +0000 UTC
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.250362 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.250421 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.250443 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.250469 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.250492 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.353724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.353771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.353789 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.353811 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.353826 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.457132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.457514 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.457761 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.457963 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.458179 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.561706 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.561779 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.561797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.561823 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.561840 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.665292 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.665358 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.665374 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.665400 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.665420 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.768063 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.768135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.768152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.768175 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.768219 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.871694 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.871768 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.871791 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.871817 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.871839 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.975408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.975468 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.975484 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.975507 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:12 crc kubenswrapper[5127]: I0201 06:49:12.975525 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:12Z","lastTransitionTime":"2026-02-01T06:49:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.079560 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.079681 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.079702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.079735 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.079758 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.183324 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.183390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.183407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.183431 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.183448 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.241413 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 10:28:57.840326099 +0000 UTC
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.286981 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.287025 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.287044 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.287074 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.287098 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.390392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.390452 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.390469 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.390494 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.390511 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.493155 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.493213 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.493228 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.493252 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.493271 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.596961 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.597116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.597135 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.597154 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.597169 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.700236 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.700324 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.700345 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.700377 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.700399 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.803933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.804007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.804031 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.804062 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.804086 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.906783 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.906864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.906883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.906909 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:13 crc kubenswrapper[5127]: I0201 06:49:13.906926 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:13Z","lastTransitionTime":"2026-02-01T06:49:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.009867 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.009916 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.009939 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.009958 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.009971 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.113339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.113408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.113434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.113463 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.113488 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.216646 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.216682 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.216690 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.216702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.216711 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.235402 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.235485 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.235505 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc"
Feb 01 06:49:14 crc kubenswrapper[5127]: E0201 06:49:14.235551 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 06:49:14 crc kubenswrapper[5127]: E0201 06:49:14.235672 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 06:49:14 crc kubenswrapper[5127]: E0201 06:49:14.235771 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.235807 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 06:49:14 crc kubenswrapper[5127]: E0201 06:49:14.235866 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.241568 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:45:47.166556754 +0000 UTC
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.319870 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.319922 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.319940 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.319962 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.319981 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.422712 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.422769 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.422780 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.422804 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.422819 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.525618 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.525685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.525701 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.525725 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.525744 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.627869 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.627926 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.627942 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.627965 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.627983 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.730827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.730890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.730907 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.730931 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.730949 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.833211 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.833261 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.833276 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.833293 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.833304 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.937355 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.937516 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.937545 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.937653 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:14 crc kubenswrapper[5127]: I0201 06:49:14.937679 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:14Z","lastTransitionTime":"2026-02-01T06:49:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.041659 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.041725 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.041741 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.041766 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.041785 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.144346 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.144399 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.144407 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.144422 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.144459 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.241895 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:01:09.106909481 +0000 UTC
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.247285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.247337 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.247354 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.247376 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.247394 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.350767 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.350844 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.350861 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.350887 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.350905 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.453575 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.453652 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.453671 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.453694 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.453712 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.556043 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.556080 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.556088 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.556101 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.556110 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.659659 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.659727 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.659746 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.659771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.659789 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.768949 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.769066 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.769096 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.769271 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.769363 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.872368 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.872419 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.872434 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.872454 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.872469 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.975824 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.975864 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.975876 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.975892 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 06:49:15 crc kubenswrapper[5127]: I0201 06:49:15.975904 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:15Z","lastTransitionTime":"2026-02-01T06:49:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.078828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.078863 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.078874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.078890 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.078902 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.181691 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.181725 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.181736 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.181752 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.181765 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.235601 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.235671 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:16 crc kubenswrapper[5127]: E0201 06:49:16.235721 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:16 crc kubenswrapper[5127]: E0201 06:49:16.235900 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.235963 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.236027 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:16 crc kubenswrapper[5127]: E0201 06:49:16.236133 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:16 crc kubenswrapper[5127]: E0201 06:49:16.236233 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.242147 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 23:55:34.943761484 +0000 UTC Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.284324 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.284399 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.284419 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.284460 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.284480 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.388103 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.388152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.388165 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.388183 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.388196 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.491058 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.491116 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.491132 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.491156 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.491174 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.594424 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.594467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.594476 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.594490 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.594501 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.697426 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.697467 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.697475 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.697489 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.697497 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.801613 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.801685 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.801702 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.801730 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.801753 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.905797 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.905874 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.905893 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.905918 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:16 crc kubenswrapper[5127]: I0201 06:49:16.905935 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:16Z","lastTransitionTime":"2026-02-01T06:49:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.009018 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.009103 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.009125 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.009152 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.009169 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.112351 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.112410 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.112427 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.112450 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.112466 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.215701 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.216024 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.216202 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.216371 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.216505 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.242846 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 12:24:12.949066057 +0000 UTC Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.320157 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.320449 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.320636 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.320827 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.321160 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.424204 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.424917 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.424982 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.425021 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.425045 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.528382 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.528447 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.528466 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.528492 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.528510 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.632101 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.632163 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.632180 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.632205 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.632226 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.734835 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.734899 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.734916 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.734939 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.734957 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.838323 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.838392 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.838408 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.838432 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.838450 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.941561 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.941708 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.941729 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.941753 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:17 crc kubenswrapper[5127]: I0201 06:49:17.941769 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:17Z","lastTransitionTime":"2026-02-01T06:49:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.045688 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.045771 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.045795 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.045828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.045851 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.149565 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.149673 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.149695 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.149724 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.149745 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.235748 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.235802 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.235893 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:18 crc kubenswrapper[5127]: E0201 06:49:18.236128 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.236476 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:18 crc kubenswrapper[5127]: E0201 06:49:18.236642 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:18 crc kubenswrapper[5127]: E0201 06:49:18.236904 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:18 crc kubenswrapper[5127]: E0201 06:49:18.237307 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.243426 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:07:02.50785243 +0000 UTC Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.252933 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.252977 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.252988 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.253007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.253021 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.356127 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.356202 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.356224 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.356253 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.356275 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.459339 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.459390 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.459402 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.459422 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.459436 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.563679 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.563780 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.563803 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.563838 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.563865 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.666220 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.666258 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.666266 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.666279 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.666290 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.769226 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.769285 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.769302 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.769325 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.769344 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.873764 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.873828 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.873853 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.873883 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.873907 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.977093 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.977160 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.977181 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.977209 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:18 crc kubenswrapper[5127]: I0201 06:49:18.977235 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:18Z","lastTransitionTime":"2026-02-01T06:49:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.069921 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.069988 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.070007 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.070040 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.070059 5127 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T06:49:19Z","lastTransitionTime":"2026-02-01T06:49:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.146298 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7"] Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.147123 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.150398 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.150658 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.150514 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.152494 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.197044 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=25.197010868 podStartE2EDuration="25.197010868s" podCreationTimestamp="2026-02-01 06:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.196007761 +0000 UTC m=+109.681910214" watchObservedRunningTime="2026-02-01 06:49:19.197010868 +0000 UTC m=+109.682913271" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.201303 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.201370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b72d5c-de9d-432b-aac6-adb5b93c299a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.201425 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b72d5c-de9d-432b-aac6-adb5b93c299a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.201513 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.201545 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b72d5c-de9d-432b-aac6-adb5b93c299a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.241449 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.2414204 podStartE2EDuration="54.2414204s" podCreationTimestamp="2026-02-01 06:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.219899368 +0000 UTC m=+109.705801751" watchObservedRunningTime="2026-02-01 06:49:19.2414204 +0000 UTC m=+109.727322783" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.243954 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:59:38.990821975 +0000 UTC Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.244003 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.254303 5127 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302409 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b72d5c-de9d-432b-aac6-adb5b93c299a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" 
(UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302458 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302483 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b72d5c-de9d-432b-aac6-adb5b93c299a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302515 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b72d5c-de9d-432b-aac6-adb5b93c299a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.302930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.303052 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a1b72d5c-de9d-432b-aac6-adb5b93c299a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.303869 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1b72d5c-de9d-432b-aac6-adb5b93c299a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.315036 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1b72d5c-de9d-432b-aac6-adb5b93c299a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.330931 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1b72d5c-de9d-432b-aac6-adb5b93c299a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7q6w7\" (UID: \"a1b72d5c-de9d-432b-aac6-adb5b93c299a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc 
kubenswrapper[5127]: I0201 06:49:19.333933 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podStartSLOduration=89.333912645 podStartE2EDuration="1m29.333912645s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.333320528 +0000 UTC m=+109.819222911" watchObservedRunningTime="2026-02-01 06:49:19.333912645 +0000 UTC m=+109.819815038" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.362865 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=36.36283551 podStartE2EDuration="36.36283551s" podCreationTimestamp="2026-02-01 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.361531855 +0000 UTC m=+109.847434228" watchObservedRunningTime="2026-02-01 06:49:19.36283551 +0000 UTC m=+109.848737903" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.396875 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=89.396854977 podStartE2EDuration="1m29.396854977s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.396184617 +0000 UTC m=+109.882086990" watchObservedRunningTime="2026-02-01 06:49:19.396854977 +0000 UTC m=+109.882757350" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.417825 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=83.417805733 podStartE2EDuration="1m23.417805733s" podCreationTimestamp="2026-02-01 06:47:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.417516745 +0000 UTC m=+109.903419128" watchObservedRunningTime="2026-02-01 06:49:19.417805733 +0000 UTC m=+109.903708106" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.435491 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cmdjj" podStartSLOduration=89.435458768 podStartE2EDuration="1m29.435458768s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.434951884 +0000 UTC m=+109.920854297" watchObservedRunningTime="2026-02-01 06:49:19.435458768 +0000 UTC m=+109.921361141" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.449807 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kqhgg" podStartSLOduration=89.449772502 podStartE2EDuration="1m29.449772502s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.44972317 +0000 UTC m=+109.935625543" watchObservedRunningTime="2026-02-01 06:49:19.449772502 +0000 UTC m=+109.935674875" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.473090 5127 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.483705 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zjfhr" podStartSLOduration=89.483672705 podStartE2EDuration="1m29.483672705s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.465895805 +0000 UTC m=+109.951798188" watchObservedRunningTime="2026-02-01 06:49:19.483672705 +0000 UTC m=+109.969575078" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.567766 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6d7gz" podStartSLOduration=89.567737818 podStartE2EDuration="1m29.567737818s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.566236676 +0000 UTC m=+110.052139109" watchObservedRunningTime="2026-02-01 06:49:19.567737818 +0000 UTC m=+110.053640231" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.592540 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xpm98" podStartSLOduration=89.592509258 podStartE2EDuration="1m29.592509258s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.592147179 +0000 UTC m=+110.078049592" watchObservedRunningTime="2026-02-01 06:49:19.592509258 +0000 UTC m=+110.078411631" Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.830397 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" event={"ID":"a1b72d5c-de9d-432b-aac6-adb5b93c299a","Type":"ContainerStarted","Data":"3eda9b0cfc0dbecf7a7ae379a003be86c1351471a75bc7b9cae459c74b3ade1a"} Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.830479 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" event={"ID":"a1b72d5c-de9d-432b-aac6-adb5b93c299a","Type":"ContainerStarted","Data":"8a85d7b89b7f478354a71c0ea644988e55105d0be04e05a9bebd606d0e167417"} Feb 01 06:49:19 crc kubenswrapper[5127]: I0201 06:49:19.853463 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7q6w7" podStartSLOduration=89.853441128 podStartE2EDuration="1m29.853441128s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:19.852560053 +0000 UTC m=+110.338462476" watchObservedRunningTime="2026-02-01 06:49:19.853441128 +0000 UTC m=+110.339343501" Feb 01 06:49:20 crc kubenswrapper[5127]: I0201 06:49:20.234945 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:20 crc kubenswrapper[5127]: I0201 06:49:20.235030 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:20 crc kubenswrapper[5127]: I0201 06:49:20.235034 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:20 crc kubenswrapper[5127]: E0201 06:49:20.236706 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:20 crc kubenswrapper[5127]: I0201 06:49:20.236740 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:20 crc kubenswrapper[5127]: E0201 06:49:20.236997 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:20 crc kubenswrapper[5127]: E0201 06:49:20.237072 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:20 crc kubenswrapper[5127]: E0201 06:49:20.237200 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:20 crc kubenswrapper[5127]: I0201 06:49:20.238384 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 06:49:20 crc kubenswrapper[5127]: E0201 06:49:20.238681 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-njlcv_openshift-ovn-kubernetes(5034ec6a-7968-4592-a09b-a57a56ebdbc5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" Feb 01 06:49:22 crc kubenswrapper[5127]: I0201 06:49:22.235038 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:22 crc kubenswrapper[5127]: I0201 06:49:22.235128 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:22 crc kubenswrapper[5127]: I0201 06:49:22.235081 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:22 crc kubenswrapper[5127]: I0201 06:49:22.235056 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:22 crc kubenswrapper[5127]: E0201 06:49:22.235235 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:22 crc kubenswrapper[5127]: E0201 06:49:22.235465 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:22 crc kubenswrapper[5127]: E0201 06:49:22.235647 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:22 crc kubenswrapper[5127]: E0201 06:49:22.235813 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.238654 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:24 crc kubenswrapper[5127]: E0201 06:49:24.238836 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.239177 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:24 crc kubenswrapper[5127]: E0201 06:49:24.239283 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.239527 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:24 crc kubenswrapper[5127]: E0201 06:49:24.239682 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.241261 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:24 crc kubenswrapper[5127]: E0201 06:49:24.241413 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.850671 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/1.log" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.851325 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/0.log" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.851393 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d959741-37e1-43e7-9ef6-5f33433f9447" containerID="a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07" exitCode=1 Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.851440 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerDied","Data":"a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07"} Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.851529 5127 scope.go:117] "RemoveContainer" containerID="0f34f311111b23666287683ecc220a45b798807ff45dc5959fa93fd4d1507436" Feb 01 06:49:24 crc kubenswrapper[5127]: I0201 06:49:24.852114 5127 scope.go:117] "RemoveContainer" containerID="a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07" Feb 01 06:49:24 crc kubenswrapper[5127]: E0201 06:49:24.852368 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cmdjj_openshift-multus(4d959741-37e1-43e7-9ef6-5f33433f9447)\"" pod="openshift-multus/multus-cmdjj" podUID="4d959741-37e1-43e7-9ef6-5f33433f9447" Feb 01 06:49:25 crc kubenswrapper[5127]: I0201 06:49:25.858346 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/1.log" Feb 01 06:49:26 crc kubenswrapper[5127]: I0201 06:49:26.234825 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:26 crc kubenswrapper[5127]: I0201 06:49:26.234920 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:26 crc kubenswrapper[5127]: E0201 06:49:26.234971 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:26 crc kubenswrapper[5127]: E0201 06:49:26.235091 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:26 crc kubenswrapper[5127]: I0201 06:49:26.235157 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:26 crc kubenswrapper[5127]: E0201 06:49:26.235208 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:26 crc kubenswrapper[5127]: I0201 06:49:26.235254 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:26 crc kubenswrapper[5127]: E0201 06:49:26.235292 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:28 crc kubenswrapper[5127]: I0201 06:49:28.235505 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:28 crc kubenswrapper[5127]: I0201 06:49:28.235606 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:28 crc kubenswrapper[5127]: I0201 06:49:28.235777 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:28 crc kubenswrapper[5127]: E0201 06:49:28.235782 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:28 crc kubenswrapper[5127]: E0201 06:49:28.235927 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:28 crc kubenswrapper[5127]: E0201 06:49:28.236158 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:28 crc kubenswrapper[5127]: I0201 06:49:28.236812 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:28 crc kubenswrapper[5127]: E0201 06:49:28.236992 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.168031 5127 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 01 06:49:30 crc kubenswrapper[5127]: I0201 06:49:30.235447 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:30 crc kubenswrapper[5127]: I0201 06:49:30.235471 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.238110 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:30 crc kubenswrapper[5127]: I0201 06:49:30.238157 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:30 crc kubenswrapper[5127]: I0201 06:49:30.238133 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.238326 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.238428 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.238527 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:30 crc kubenswrapper[5127]: E0201 06:49:30.329884 5127 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 06:49:32 crc kubenswrapper[5127]: I0201 06:49:32.235557 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:32 crc kubenswrapper[5127]: I0201 06:49:32.235729 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:32 crc kubenswrapper[5127]: I0201 06:49:32.235558 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:32 crc kubenswrapper[5127]: E0201 06:49:32.235792 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:32 crc kubenswrapper[5127]: E0201 06:49:32.235940 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:32 crc kubenswrapper[5127]: I0201 06:49:32.236041 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:32 crc kubenswrapper[5127]: E0201 06:49:32.236150 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:32 crc kubenswrapper[5127]: E0201 06:49:32.236245 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:34 crc kubenswrapper[5127]: I0201 06:49:34.235874 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:34 crc kubenswrapper[5127]: I0201 06:49:34.235958 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:34 crc kubenswrapper[5127]: I0201 06:49:34.235973 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:34 crc kubenswrapper[5127]: E0201 06:49:34.236072 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:34 crc kubenswrapper[5127]: E0201 06:49:34.236236 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:34 crc kubenswrapper[5127]: I0201 06:49:34.236330 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:34 crc kubenswrapper[5127]: E0201 06:49:34.236462 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:34 crc kubenswrapper[5127]: E0201 06:49:34.236537 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:35 crc kubenswrapper[5127]: I0201 06:49:35.235692 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 06:49:35 crc kubenswrapper[5127]: E0201 06:49:35.331351 5127 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 01 06:49:35 crc kubenswrapper[5127]: I0201 06:49:35.897111 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/3.log" Feb 01 06:49:35 crc kubenswrapper[5127]: I0201 06:49:35.901305 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerStarted","Data":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 06:49:35 crc kubenswrapper[5127]: I0201 06:49:35.901775 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:49:35 crc kubenswrapper[5127]: I0201 06:49:35.949845 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podStartSLOduration=105.949814176 podStartE2EDuration="1m45.949814176s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:35.946485155 +0000 UTC m=+126.432387608" watchObservedRunningTime="2026-02-01 06:49:35.949814176 +0000 UTC m=+126.435716589" Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.235338 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.235389 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.235403 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.235444 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:36 crc kubenswrapper[5127]: E0201 06:49:36.235567 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:36 crc kubenswrapper[5127]: E0201 06:49:36.235674 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:36 crc kubenswrapper[5127]: E0201 06:49:36.235751 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:36 crc kubenswrapper[5127]: E0201 06:49:36.235829 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.251990 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ls5xc"] Feb 01 06:49:36 crc kubenswrapper[5127]: I0201 06:49:36.904985 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:36 crc kubenswrapper[5127]: E0201 06:49:36.905686 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:37 crc kubenswrapper[5127]: I0201 06:49:37.236045 5127 scope.go:117] "RemoveContainer" containerID="a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07" Feb 01 06:49:37 crc kubenswrapper[5127]: I0201 06:49:37.911713 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/1.log" Feb 01 06:49:37 crc kubenswrapper[5127]: I0201 06:49:37.911793 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerStarted","Data":"cb8c2c5fe80a0d88b2acc8e86cef40eb17b13dca0c3ed6ce63bb1ca011ae4786"} Feb 01 06:49:38 crc kubenswrapper[5127]: I0201 06:49:38.234936 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:38 crc kubenswrapper[5127]: I0201 06:49:38.235006 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:38 crc kubenswrapper[5127]: E0201 06:49:38.235122 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:38 crc kubenswrapper[5127]: I0201 06:49:38.234961 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:38 crc kubenswrapper[5127]: E0201 06:49:38.235296 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:38 crc kubenswrapper[5127]: E0201 06:49:38.235403 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:39 crc kubenswrapper[5127]: I0201 06:49:39.234537 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:39 crc kubenswrapper[5127]: E0201 06:49:39.235149 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ls5xc" podUID="bafc814f-6c41-40cf-b3f4-8babc6ec840a" Feb 01 06:49:40 crc kubenswrapper[5127]: I0201 06:49:40.235563 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:40 crc kubenswrapper[5127]: I0201 06:49:40.235683 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:40 crc kubenswrapper[5127]: I0201 06:49:40.237487 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:40 crc kubenswrapper[5127]: E0201 06:49:40.237474 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 06:49:40 crc kubenswrapper[5127]: E0201 06:49:40.237644 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 06:49:40 crc kubenswrapper[5127]: E0201 06:49:40.237830 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 06:49:41 crc kubenswrapper[5127]: I0201 06:49:41.235408 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:49:41 crc kubenswrapper[5127]: I0201 06:49:41.238518 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 06:49:41 crc kubenswrapper[5127]: I0201 06:49:41.238575 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.235256 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.235349 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.235532 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.238127 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.241358 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.242027 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 01 06:49:42 crc kubenswrapper[5127]: I0201 06:49:42.246306 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.638517 5127 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.691692 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w4bsb"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.694026 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.693833 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.695524 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvg79"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.695723 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.696384 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.696944 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.697785 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.700245 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.700861 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.704916 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.706083 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.706752 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.707414 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.713998 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.714341 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.715282 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 01 06:49:49 crc kubenswrapper[5127]: W0201 06:49:49.715544 5127 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 01 06:49:49 crc kubenswrapper[5127]: E0201 06:49:49.715647 5127 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.715835 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.716192 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.716472 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.716651 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.717249 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 
06:49:49 crc kubenswrapper[5127]: W0201 06:49:49.717249 5127 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 01 06:49:49 crc kubenswrapper[5127]: E0201 06:49:49.717376 5127 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.717631 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 01 06:49:49 crc kubenswrapper[5127]: W0201 06:49:49.721337 5127 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 01 06:49:49 crc kubenswrapper[5127]: E0201 06:49:49.721397 5127 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.726146 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.726621 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.726843 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.726890 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.727039 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.727092 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.727287 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.727482 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 
06:49:49.727645 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.727803 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.729215 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.729933 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.733535 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.733875 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.733903 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.734960 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 01 06:49:49 crc kubenswrapper[5127]: W0201 06:49:49.735172 5127 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 01 06:49:49 crc kubenswrapper[5127]: E0201 06:49:49.735223 5127 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.735575 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.735628 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.735803 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736021 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: W0201 06:49:49.736117 5127 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Feb 01 06:49:49 crc kubenswrapper[5127]: E0201 
06:49:49.736146 5127 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736289 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736666 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736702 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736865 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.736926 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.737476 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.738426 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.739073 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.740666 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.740731 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.740859 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.741920 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.742169 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.742370 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.742556 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.742863 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.743035 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.744010 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz642"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.744519 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.744961 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-m947v"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.747052 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.747526 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.748153 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-8rw72"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.748662 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.748709 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-8rw72"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.749810 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.750228 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.771093 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.774029 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxg44"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.774916 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.774991 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.775145 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.775190 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.775689 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.793733 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.794416 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.795347 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.795819 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796228 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796333 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796438 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796527 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796593 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796631 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796690 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796717 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796753 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796770 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796785 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796835 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796854 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796894 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796916 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796961 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796984 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796987 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797024 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797055 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.796961 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797132 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797151 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797223 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797298 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797369 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797386 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797459 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797472 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.797646 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.798845 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.799251 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.802019 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.805074 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.807418 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.808839 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-z8hkn"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.809487 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.818447 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.819230 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.820127 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.820489 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.820719 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.820882 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.821028 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.821146 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.821288 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.822279 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.822321 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.822425 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.823867 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.824045 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.824565 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.824966 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.825740 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.825843 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ct7v5"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.826337 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.834003 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.836555 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.841397 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.841641 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.842367 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk"
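Each "Caches populated for *v1.ConfigMap/*v1.Secret from object-"<namespace>"/"<name>"" line above is one per-object reflector finishing its initial LIST: the kubelet runs a separate list/watch for every ConfigMap and Secret its pods reference, scoped to a single namespace and name. A minimal sketch of that machinery with client-go's cache package (illustrative, not the kubelet's exact wiring; assumes a reachable kubeconfig):

    package main

    import (
        "fmt"
        "time"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/fields"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/cache"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // One list/watch per object, scoped by namespace plus a metadata.name
        // field selector, mirroring the object-"<namespace>"/"<name>" naming above.
        lw := cache.NewListWatchFromClient(
            client.CoreV1().RESTClient(), "configmaps",
            "openshift-route-controller-manager",
            fields.OneTermEqualSelector("metadata.name", "openshift-service-ca.crt"))

        store := cache.NewStore(cache.MetaNamespaceKeyFunc)
        r := cache.NewReflector(lw, &corev1.ConfigMap{}, store, 0)

        stop := make(chan struct{})
        defer close(stop)
        go r.Run(stop) // initial LIST populates the store, then WATCH keeps it fresh

        time.Sleep(2 * time.Second) // crude wait for the initial LIST in this sketch
        fmt.Println("cached:", store.ListKeys())
    }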
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.843877 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.844889 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853620 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853683 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-node-pullsecrets\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853710 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-serving-cert\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853736 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgjpf\" (UniqueName: \"kubernetes.io/projected/ff15e8e3-79fb-4691-99c3-8956ba943381-kube-api-access-dgjpf\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853755 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853789 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b48d863-39c0-40ca-b016-2c92f284eace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853810 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853837 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853860 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e364462-04d8-45ed-b896-7a82db12c738-machine-approver-tls\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853882 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853905 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b48d863-39c0-40ca-b016-2c92f284eace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853925 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-serving-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853955 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c828d93e-bd87-4588-80af-0f6e69f9c81f-serving-cert\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.853996 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854017 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37ad4-73dd-4eba-967f-ce65cb3385bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854027 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854206 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854231 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-encryption-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854251 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854272 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854822 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.854279 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856658 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-config\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856680 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856701 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81a435a5-ae11-4994-86f9-cdabafa80e4f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856718 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856736 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6h8\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-kube-api-access-8f6h8\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856753 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-trusted-ca\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856767 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856783 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856800 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5vdf\" (UniqueName: \"kubernetes.io/projected/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-kube-api-access-x5vdf\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856817 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-client\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856832 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856846 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cf4\" (UniqueName: \"kubernetes.io/projected/c828d93e-bd87-4588-80af-0f6e69f9c81f-kube-api-access-f8cf4\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856861 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rrnv\" (UniqueName: \"kubernetes.io/projected/5e364462-04d8-45ed-b896-7a82db12c738-kube-api-access-4rrnv\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856876 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856894 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c86aea-4121-437a-98a3-e7d14024c548-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856910 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856925 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856933 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.856938 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qzhv\" (UniqueName: \"kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857052 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjww\" (UniqueName: \"kubernetes.io/projected/27bef954-7005-4626-9049-195b48a9365b-kube-api-access-lkjww\") pod \"downloads-7954f5f757-8rw72\" (UID: \"27bef954-7005-4626-9049-195b48a9365b\") " pod="openshift-console/downloads-7954f5f757-8rw72"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857071 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857109 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-dir\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857125 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857143 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857158 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch6jw\" (UniqueName: \"kubernetes.io/projected/04c86aea-4121-437a-98a3-e7d14024c548-kube-api-access-ch6jw\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857173 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-image-import-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857186 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-serving-cert\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857200 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857214 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvhp\" (UniqueName: \"kubernetes.io/projected/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-kube-api-access-gcvhp\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857233 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-config\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857260 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qcgs\" (UniqueName: \"kubernetes.io/projected/ded37ad4-73dd-4eba-967f-ce65cb3385bb-kube-api-access-8qcgs\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857277 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857290 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857307 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfm2j\" (UniqueName: \"kubernetes.io/projected/9b48d863-39c0-40ca-b016-2c92f284eace-kube-api-access-cfm2j\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857337 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkr67\" (UniqueName: \"kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857353 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857386 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15e8e3-79fb-4691-99c3-8956ba943381-serving-cert\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857401 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a435a5-ae11-4994-86f9-cdabafa80e4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857417 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857431 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhcwz\" (UniqueName: \"kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857445 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded37ad4-73dd-4eba-967f-ce65cb3385bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857449 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857472 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-config\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857486 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857500 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857516 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857532 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-policies\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857546 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-auth-proxy-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857568 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit-dir\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857597 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpsnn\" (UniqueName: \"kubernetes.io/projected/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-kube-api-access-lpsnn\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857612 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857625 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.857641 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-encryption-config\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.859552 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.866170 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.866307 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.866792 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867081 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-z7mc2"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867352 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867431 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc"
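The reconciler_common.go:245 burst above is the kubelet volume manager at work: for every volume required by the pods just ADDed, it compares its desired state of world against the actual state of world and, for each volume not yet present, starts VerifyControllerAttachedVolume before the volume can be mounted into the pod. A toy model of that reconcile step (not kubelet code; the types and names here are illustrative):

    package main

    import "fmt"

    // volume pairs a unique volume name with the pod that requires it,
    // loosely mirroring the UniqueName/pod fields in the log lines.
    type volume struct {
        uniqueName string
        pod        string
    }

    // reconcile starts verification for every desired volume missing from
    // the actual state, which is what each "VerifyControllerAttachedVolume
    // started" line records.
    func reconcile(desired []volume, actual map[string]bool) {
        for _, v := range desired {
            if !actual[v.uniqueName] {
                fmt.Printf("VerifyControllerAttachedVolume started for %q (pod %q)\n",
                    v.uniqueName, v.pod)
                actual[v.uniqueName] = true // toy: assume verification succeeds
            }
        }
    }

    func main() {
        desired := []volume{
            {"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config",
                "route-controller-manager-6576b87f9c-mpbrq"},
            {"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-serving-cert",
                "apiserver-7bbb656c7d-fr6ms"},
        }
        reconcile(desired, map[string]bool{}) // a second run would print nothing
    }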
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867500 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867592 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z7mc2"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867651 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.867817 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.868072 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.868367 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmp6b"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.868640 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.868835 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.868965 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869271 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869331 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869524 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869665 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blb44"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869914 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.869955 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870229 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-blb44"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870344 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870536 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870646 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870687 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.870933 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jpdck"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.871208 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w4bsb"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.871289 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpdck"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.871513 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.871706 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.873661 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.874770 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.875971 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.876303 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxg44"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.876939 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.877812 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ct7v5"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.878950 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz642"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.881476 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"]
Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.881501 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq"]
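In the kubelet.go:2421/2428 lines, "SyncLoop ADD" is the first time this kubelet sees a pod from the "api" source, which is why each ADD is paired with a util.go:30 "No sandbox for pod can be found" message, while "SyncLoop UPDATE" re-syncs a pod the kubelet already manages (here, mostly the API server updating the just-created pods). A toy dispatcher showing that pairing (not kubelet code; the structure is illustrative only):

    package main

    import "fmt"

    // podEvent loosely mirrors what the sync loop receives from a config
    // source: an operation and the affected namespace/name pod keys.
    type podEvent struct {
        op   string   // "ADD" or "UPDATE"
        pods []string
    }

    // syncLoopIteration prints one line per event, plus a sandbox message
    // for each pod this "kubelet" has never synced before.
    func syncLoopIteration(ev podEvent, known map[string]bool) {
        fmt.Printf("SyncLoop %s source=%q pods=%v\n", ev.op, "api", ev.pods)
        for _, p := range ev.pods {
            if ev.op == "ADD" && !known[p] {
                known[p] = true
                // The first sync of a new pod finds no sandbox and creates
                // one, matching the paired util.go:30 lines above.
                fmt.Printf("no sandbox for %q, starting a new one\n", p)
            }
        }
    }

    func main() {
        known := map[string]bool{}
        syncLoopIteration(podEvent{"ADD", []string{"openshift-dns/dns-default-5zh5x"}}, known)
        syncLoopIteration(podEvent{"UPDATE", []string{"openshift-dns/dns-default-5zh5x"}}, known)
    }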
kubenswrapper[5127]: I0201 06:49:49.882931 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.884647 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.885385 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8rw72"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.886943 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.889242 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-m947v"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.889272 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.890319 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.891324 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8vpcc"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.892339 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.894193 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5zh5x"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.894832 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.896047 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-z8hkn"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.906556 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.916258 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.917279 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.918748 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.920471 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.920608 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmp6b"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.922808 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.925109 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.927235 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.928871 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.930282 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvg79"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.933538 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.934913 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blb44"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.935591 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpdck"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.936748 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.937704 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.937889 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 
06:49:49.938748 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.939744 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.940728 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5zh5x"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.941894 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.943122 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.944348 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.945523 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgmvx"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.946648 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.946764 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgmvx"] Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.955773 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958124 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e364462-04d8-45ed-b896-7a82db12c738-machine-approver-tls\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958156 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958179 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958198 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-serving-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958218 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b48d863-39c0-40ca-b016-2c92f284eace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958235 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37ad4-73dd-4eba-967f-ce65cb3385bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958257 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c828d93e-bd87-4588-80af-0f6e69f9c81f-serving-cert\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958293 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958315 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958331 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958367 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktcpn\" (UniqueName: \"kubernetes.io/projected/015e8567-adb7-421c-8613-1611c4768cbe-kube-api-access-ktcpn\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958384 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958400 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d7eea465-934a-4ec6-8eeb-2d7199fc3594-trusted-ca\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958423 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-encryption-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958441 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958454 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958469 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-config\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958484 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958499 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958522 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81a435a5-ae11-4994-86f9-cdabafa80e4f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958541 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhv8x\" (UniqueName: \"kubernetes.io/projected/501ce6ad-6013-458e-b398-d0b8ca7c1915-kube-api-access-xhv8x\") pod \"service-ca-9c57cc56f-blb44\" (UID: 
\"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958559 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6h8\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-kube-api-access-8f6h8\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958592 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958617 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958641 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-trusted-ca\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958661 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958677 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-client\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958693 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958735 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5vdf\" (UniqueName: \"kubernetes.io/projected/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-kube-api-access-x5vdf\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958777 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f8cf4\" (UniqueName: \"kubernetes.io/projected/c828d93e-bd87-4588-80af-0f6e69f9c81f-kube-api-access-f8cf4\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958816 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrnv\" (UniqueName: \"kubernetes.io/projected/5e364462-04d8-45ed-b896-7a82db12c738-kube-api-access-4rrnv\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958833 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958850 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjww\" (UniqueName: \"kubernetes.io/projected/27bef954-7005-4626-9049-195b48a9365b-kube-api-access-lkjww\") pod \"downloads-7954f5f757-8rw72\" (UID: \"27bef954-7005-4626-9049-195b48a9365b\") " pod="openshift-console/downloads-7954f5f757-8rw72" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958854 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958886 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c86aea-4121-437a-98a3-e7d14024c548-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958907 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958924 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958942 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qzhv\" (UniqueName: \"kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv\") pod 
\"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958978 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-dir\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.958993 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959009 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959061 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcvhp\" (UniqueName: \"kubernetes.io/projected/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-kube-api-access-gcvhp\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959078 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959094 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch6jw\" (UniqueName: \"kubernetes.io/projected/04c86aea-4121-437a-98a3-e7d14024c548-kube-api-access-ch6jw\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959129 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-image-import-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959144 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-serving-cert\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959159 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-config\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959174 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qcgs\" (UniqueName: \"kubernetes.io/projected/ded37ad4-73dd-4eba-967f-ce65cb3385bb-kube-api-access-8qcgs\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959214 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl4br\" (UniqueName: \"kubernetes.io/projected/4bc57b8f-e457-482e-8c41-27fbb779b6a5-kube-api-access-kl4br\") pod \"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959236 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959231 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959264 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959284 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxrvq\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-kube-api-access-lxrvq\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959307 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959328 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959353 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959375 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfm2j\" (UniqueName: \"kubernetes.io/projected/9b48d863-39c0-40ca-b016-2c92f284eace-kube-api-access-cfm2j\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959466 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959501 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkr67\" (UniqueName: \"kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959519 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959770 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " 
pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959825 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959851 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15e8e3-79fb-4691-99c3-8956ba943381-serving-cert\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959897 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a435a5-ae11-4994-86f9-cdabafa80e4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.959925 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded37ad4-73dd-4eba-967f-ce65cb3385bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960017 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960039 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhcwz\" (UniqueName: \"kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960151 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960176 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z27bk\" (UniqueName: \"kubernetes.io/projected/bd2288fe-2a63-4283-946c-5c5f1891a007-kube-api-access-z27bk\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:49 crc 
kubenswrapper[5127]: I0201 06:49:49.960210 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-config\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960233 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960253 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960283 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-policies\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960302 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-auth-proxy-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit-dir\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960325 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-config\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960342 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpsnn\" (UniqueName: \"kubernetes.io/projected/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-kube-api-access-lpsnn\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-encryption-config\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: 
\"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960440 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960492 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960544 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-images\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960566 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/015e8567-adb7-421c-8613-1611c4768cbe-proxy-tls\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960621 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-node-pullsecrets\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960644 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-serving-cert\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960687 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bc57b8f-e457-482e-8c41-27fbb779b6a5-metrics-tls\") pod \"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960712 5127 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7eea465-934a-4ec6-8eeb-2d7199fc3594-metrics-tls\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960734 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960800 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgjpf\" (UniqueName: \"kubernetes.io/projected/ff15e8e3-79fb-4691-99c3-8956ba943381-kube-api-access-dgjpf\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960852 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960880 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b48d863-39c0-40ca-b016-2c92f284eace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.960993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-serving-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.961944 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b48d863-39c0-40ca-b016-2c92f284eace-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.963253 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded37ad4-73dd-4eba-967f-ce65cb3385bb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.963484 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.963487 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.963800 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.963917 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.964043 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-service-ca-bundle\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.964388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b48d863-39c0-40ca-b016-2c92f284eace-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.964622 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.965241 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-image-import-ca\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.965792 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.965879 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.965904 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.965988 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81a435a5-ae11-4994-86f9-cdabafa80e4f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966004 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-audit-dir\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff15e8e3-79fb-4691-99c3-8956ba943381-config\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966410 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-node-pullsecrets\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966438 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-auth-proxy-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966825 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.966860 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967222 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff15e8e3-79fb-4691-99c3-8956ba943381-serving-cert\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967260 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-config\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967325 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967423 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967522 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-dir\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.967808 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-audit-policies\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968063 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c828d93e-bd87-4588-80af-0f6e69f9c81f-trusted-ca\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 
06:49:49.968079 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968118 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968224 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968675 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c828d93e-bd87-4588-80af-0f6e69f9c81f-serving-cert\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968962 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e364462-04d8-45ed-b896-7a82db12c738-config\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.968982 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-encryption-config\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.969023 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.969340 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-serving-cert\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.969422 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ded37ad4-73dd-4eba-967f-ce65cb3385bb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.969548 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.969803 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970102 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970144 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-serving-cert\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970534 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-etcd-client\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970694 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/04c86aea-4121-437a-98a3-e7d14024c548-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970831 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5e364462-04d8-45ed-b896-7a82db12c738-machine-approver-tls\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.970862 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.971192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.971791 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/81a435a5-ae11-4994-86f9-cdabafa80e4f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.972006 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-encryption-config\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.972687 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.976438 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.977385 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.981942 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:49 crc kubenswrapper[5127]: I0201 06:49:49.995868 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.016942 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.035982 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.056522 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.062006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktcpn\" (UniqueName: 
\"kubernetes.io/projected/015e8567-adb7-421c-8613-1611c4768cbe-kube-api-access-ktcpn\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.062045 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7eea465-934a-4ec6-8eeb-2d7199fc3594-trusted-ca\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.062080 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhv8x\" (UniqueName: \"kubernetes.io/projected/501ce6ad-6013-458e-b398-d0b8ca7c1915-kube-api-access-xhv8x\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.062113 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.062482 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl4br\" (UniqueName: \"kubernetes.io/projected/4bc57b8f-e457-482e-8c41-27fbb779b6a5-kube-api-access-kl4br\") pod \"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063034 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063062 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063078 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxrvq\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-kube-api-access-lxrvq\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063211 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 
06:49:50.063231 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063333 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z27bk\" (UniqueName: \"kubernetes.io/projected/bd2288fe-2a63-4283-946c-5c5f1891a007-kube-api-access-z27bk\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bc57b8f-e457-482e-8c41-27fbb779b6a5-metrics-tls\") pod \"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063426 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7eea465-934a-4ec6-8eeb-2d7199fc3594-metrics-tls\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063447 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-images\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063464 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/015e8567-adb7-421c-8613-1611c4768cbe-proxy-tls\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.063659 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d7eea465-934a-4ec6-8eeb-2d7199fc3594-trusted-ca\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.064404 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.066534 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4bc57b8f-e457-482e-8c41-27fbb779b6a5-metrics-tls\") pod 
\"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.066876 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7eea465-934a-4ec6-8eeb-2d7199fc3594-metrics-tls\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.097001 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.116115 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.136924 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.158252 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.176699 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.196130 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.216545 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.236511 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.256909 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.276018 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.296005 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.316792 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.336314 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.356518 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.376450 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.396671 5127 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.416805 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.436410 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.456339 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.476640 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.496841 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.516446 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.537153 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.556492 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.576796 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.597307 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.616444 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.637020 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.656542 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.676682 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.696821 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.716568 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.736266 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" 
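[Editor's note: the entries above interleave three kubelet subsystems: reconciler_common.go announcing that a mount operation has started, operation_generator.go confirming MountVolume.SetUp, and reflector.go reporting that a per-namespace Secret/ConfigMap watch cache has been primed. A minimal, hypothetical Go sketch for splitting the klog header out of one of these messages follows; it assumes the journald prefix ("Feb 01 06:49:50 crc kubenswrapper[5127]: ") has already been stripped, and the regexp and field names are illustrative, not taken from kubelet source.]

// parse_klog_line.go
//
// Sketch: split a klog-format kubelet message into its header fields.
// Header layout visible in this log: Lmmdd hh:mm:ss.uuuuuu PID file.go:line] msg
package main

import (
	"fmt"
	"regexp"
)

// klogLine captures: severity letter, MMDD date, time, PID, source location, message.
var klogLine = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./_-]+:\d+)\] (.*)$`)

func main() {
	entry := `I0201 06:49:50.736266 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"`
	if m := klogLine.FindStringSubmatch(entry); m != nil {
		fmt.Printf("severity=%s date=%s time=%s pid=%s src=%s\n", m[1], m[2], m[3], m[4], m[5])
		fmt.Printf("message=%s\n", m[6])
	}
}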
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.758548 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.776542 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.797311 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.816633 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.836809 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.857704 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.874500 5127 request.go:700] Waited for 1.006545303s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.876764 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.896920 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.916397 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.936988 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.956739 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.967800 5127 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.967979 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images podName:2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.467940416 +0000 UTC m=+141.953842819 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images") pod "machine-api-operator-5694c8668f-kvg79" (UID: "2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c") : failed to sync configmap cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.968449 5127 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.968563 5127 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.968654 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client podName:3e586b38-fbcb-4142-b1b0-46f7824f9cc5 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.468551574 +0000 UTC m=+141.954453977 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client") pod "apiserver-7bbb656c7d-fr6ms" (UID: "3e586b38-fbcb-4142-b1b0-46f7824f9cc5") : failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: E0201 06:49:50.968685 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls podName:2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.468672427 +0000 UTC m=+141.954574830 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-kvg79" (UID: "2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c") : failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.977396 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 01 06:49:50 crc kubenswrapper[5127]: I0201 06:49:50.996670 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.015556 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.027524 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/015e8567-adb7-421c-8613-1611c4768cbe-proxy-tls\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"
Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.035818 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.044347 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/015e8567-adb7-421c-8613-1611c4768cbe-images\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"
Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.057051 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.062418 5127 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.062649 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle podName:501ce6ad-6013-458e-b398-d0b8ca7c1915 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.562628156 +0000 UTC m=+142.048530589 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle") pod "service-ca-9c57cc56f-blb44" (UID: "501ce6ad-6013-458e-b398-d0b8ca7c1915") : failed to sync configmap cache: timed out waiting for the condition
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.063254 5127 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.063361 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key podName:501ce6ad-6013-458e-b398-d0b8ca7c1915 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.563335156 +0000 UTC m=+142.049237529 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key") pod "service-ca-9c57cc56f-blb44" (UID: "501ce6ad-6013-458e-b398-d0b8ca7c1915") : failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.063424 5127 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 01 06:49:51 crc kubenswrapper[5127]: E0201 06:49:51.063525 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert podName:bd2288fe-2a63-4283-946c-5c5f1891a007 nodeName:}" failed. No retries permitted until 2026-02-01 06:49:51.56350376 +0000 UTC m=+142.049406133 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert") pod "ingress-canary-jpdck" (UID: "bd2288fe-2a63-4283-946c-5c5f1891a007") : failed to sync secret cache: timed out waiting for the condition Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.075638 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.099408 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.116567 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.136298 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.156904 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.177362 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.197106 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.216929 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.236273 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.256750 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.277873 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.297379 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.316749 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.336947 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.356831 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.376883 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.408342 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.416466 5127 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.436019 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.456017 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.477087 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.483674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.483882 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.483950 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.496254 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.516295 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.537235 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.556275 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.577366 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.586846 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.587028 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.587138 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.588733 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-cabundle\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.591993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/501ce6ad-6013-458e-b398-d0b8ca7c1915-signing-key\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.592867 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd2288fe-2a63-4283-946c-5c5f1891a007-cert\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.596185 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.617216 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.636918 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.657566 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.697784 5127 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.716875 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.737153 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.781937 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpsnn\" (UniqueName: \"kubernetes.io/projected/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-kube-api-access-lpsnn\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.816204 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfm2j\" (UniqueName: \"kubernetes.io/projected/9b48d863-39c0-40ca-b016-2c92f284eace-kube-api-access-cfm2j\") pod 
\"openshift-apiserver-operator-796bbdcf4f-lhf7v\" (UID: \"9b48d863-39c0-40ca-b016-2c92f284eace\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.831485 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkr67\" (UniqueName: \"kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67\") pod \"route-controller-manager-6576b87f9c-mpbrq\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.849554 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvhp\" (UniqueName: \"kubernetes.io/projected/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-kube-api-access-gcvhp\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.864566 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch6jw\" (UniqueName: \"kubernetes.io/projected/04c86aea-4121-437a-98a3-e7d14024c548-kube-api-access-ch6jw\") pod \"cluster-samples-operator-665b6dd947-czrhc\" (UID: \"04c86aea-4121-437a-98a3-e7d14024c548\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.874546 5127 request.go:700] Waited for 1.907589645s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.882752 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5vdf\" (UniqueName: \"kubernetes.io/projected/b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45-kube-api-access-x5vdf\") pod \"apiserver-76f77b778f-w4bsb\" (UID: \"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45\") " pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.896485 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qcgs\" (UniqueName: \"kubernetes.io/projected/ded37ad4-73dd-4eba-967f-ce65cb3385bb-kube-api-access-8qcgs\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdjnm\" (UID: \"ded37ad4-73dd-4eba-967f-ce65cb3385bb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.922167 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrnv\" (UniqueName: \"kubernetes.io/projected/5e364462-04d8-45ed-b896-7a82db12c738-kube-api-access-4rrnv\") pod \"machine-approver-56656f9798-mcc7n\" (UID: \"5e364462-04d8-45ed-b896-7a82db12c738\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.922374 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.938204 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.939749 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cf4\" (UniqueName: \"kubernetes.io/projected/c828d93e-bd87-4588-80af-0f6e69f9c81f-kube-api-access-f8cf4\") pod \"console-operator-58897d9998-m947v\" (UID: \"c828d93e-bd87-4588-80af-0f6e69f9c81f\") " pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.965121 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6h8\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-kube-api-access-8f6h8\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.965208 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.985834 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhcwz\" (UniqueName: \"kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz\") pod \"oauth-openshift-558db77b4-kz642\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") " pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.997109 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" Feb 01 06:49:51 crc kubenswrapper[5127]: I0201 06:49:51.997592 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.001962 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjww\" (UniqueName: \"kubernetes.io/projected/27bef954-7005-4626-9049-195b48a9365b-kube-api-access-lkjww\") pod \"downloads-7954f5f757-8rw72\" (UID: \"27bef954-7005-4626-9049-195b48a9365b\") " pod="openshift-console/downloads-7954f5f757-8rw72" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.007411 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.017288 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgjpf\" (UniqueName: \"kubernetes.io/projected/ff15e8e3-79fb-4691-99c3-8956ba943381-kube-api-access-dgjpf\") pod \"authentication-operator-69f744f599-k5wzh\" (UID: \"ff15e8e3-79fb-4691-99c3-8956ba943381\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.020706 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-8rw72" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.038439 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qzhv\" (UniqueName: \"kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv\") pod \"console-f9d7485db-zfpgn\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.054107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a435a5-ae11-4994-86f9-cdabafa80e4f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f8hq9\" (UID: \"81a435a5-ae11-4994-86f9-cdabafa80e4f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.078245 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktcpn\" (UniqueName: \"kubernetes.io/projected/015e8567-adb7-421c-8613-1611c4768cbe-kube-api-access-ktcpn\") pod \"machine-config-operator-74547568cd-5pg2r\" (UID: \"015e8567-adb7-421c-8613-1611c4768cbe\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.089986 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.102566 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhv8x\" (UniqueName: \"kubernetes.io/projected/501ce6ad-6013-458e-b398-d0b8ca7c1915-kube-api-access-xhv8x\") pod \"service-ca-9c57cc56f-blb44\" (UID: \"501ce6ad-6013-458e-b398-d0b8ca7c1915\") " pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.127135 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.129544 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl4br\" (UniqueName: \"kubernetes.io/projected/4bc57b8f-e457-482e-8c41-27fbb779b6a5-kube-api-access-kl4br\") pod \"dns-operator-744455d44c-z8hkn\" (UID: \"4bc57b8f-e457-482e-8c41-27fbb779b6a5\") " pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.137736 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-bound-sa-token\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.151673 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxrvq\" (UniqueName: \"kubernetes.io/projected/d7eea465-934a-4ec6-8eeb-2d7199fc3594-kube-api-access-lxrvq\") pod \"ingress-operator-5b745b69d9-r6zts\" (UID: \"d7eea465-934a-4ec6-8eeb-2d7199fc3594\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.175151 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z27bk\" (UniqueName: \"kubernetes.io/projected/bd2288fe-2a63-4283-946c-5c5f1891a007-kube-api-access-z27bk\") pod \"ingress-canary-jpdck\" (UID: \"bd2288fe-2a63-4283-946c-5c5f1891a007\") " pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.196100 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.201804 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.207054 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-images\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.213227 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.213523 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.217953 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.218021 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-blb44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.241033 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.247760 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jpdck" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.259668 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.268010 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3e586b38-fbcb-4142-b1b0-46f7824f9cc5-etcd-client\") pod \"apiserver-7bbb656c7d-fr6ms\" (UID: \"3e586b38-fbcb-4142-b1b0-46f7824f9cc5\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.270435 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.276460 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.294065 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kvg79\" (UID: \"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316319 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghnrs\" (UniqueName: \"kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316456 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316520 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316539 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82v86\" (UniqueName: \"kubernetes.io/projected/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-kube-api-access-82v86\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316560 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpw4\" (UniqueName: \"kubernetes.io/projected/f542f82a-9107-4056-9beb-0fcc49df176a-kube-api-access-jjpw4\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316597 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316635 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwqzx\" (UniqueName: \"kubernetes.io/projected/bd3e5544-39fe-4dfe-843f-b9281085274e-kube-api-access-jwqzx\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316677 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316699 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316717 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af3eca47-a30a-4d71-812b-01b5719b08e9-service-ca-bundle\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316738 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3e5544-39fe-4dfe-843f-b9281085274e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: 
\"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316782 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-srv-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316804 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316827 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316847 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316868 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-apiservice-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316886 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316921 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25a3f596-1eb9-4bdb-afe9-902545ed5197-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316943 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316975 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szrk4\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.316996 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85e1eb9a-c828-43bc-9d44-4071fb0a1210-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317016 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-metrics-certs\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317059 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-service-ca\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317079 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317096 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317118 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317139 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsh2m\" (UniqueName: 
\"kubernetes.io/projected/85e1eb9a-c828-43bc-9d44-4071fb0a1210-kube-api-access-rsh2m\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317161 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-serving-cert\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-client\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317214 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-config\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317235 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvrsx\" (UniqueName: \"kubernetes.io/projected/98ec7bfa-b340-4f65-ada5-db427bd64a4e-kube-api-access-tvrsx\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317251 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-default-certificate\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317279 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317304 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qftc6\" (UniqueName: \"kubernetes.io/projected/12e95361-86af-4218-b7f7-56582c3a17b7-kube-api-access-qftc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317351 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-node-bootstrap-token\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317403 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a3f596-1eb9-4bdb-afe9-902545ed5197-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317468 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7n9r\" (UniqueName: \"kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317543 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfmxg\" (UniqueName: \"kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317563 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317606 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.317623 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-metrics-tls\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318491 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txlsd\" (UniqueName: 
\"kubernetes.io/projected/0d21a97a-a2c1-4296-b080-8af4e4a22638-kube-api-access-txlsd\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318594 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lblm\" (UniqueName: \"kubernetes.io/projected/abc25820-f936-4169-996c-d337ec58713b-kube-api-access-7lblm\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318647 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318681 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e5544-39fe-4dfe-843f-b9281085274e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318704 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-config-volume\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318731 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318755 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-ca\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318772 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-profile-collector-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318948 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6x76s\" (UniqueName: \"kubernetes.io/projected/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-kube-api-access-6x76s\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318975 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nv98\" (UniqueName: \"kubernetes.io/projected/2d5b2407-6eb7-4bac-8c75-cdd0820f3974-kube-api-access-2nv98\") pod \"migrator-59844c95c7-vcfxq\" (UID: \"2d5b2407-6eb7-4bac-8c75-cdd0820f3974\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.318997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-config\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319017 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-stats-auth\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319069 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319103 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-srv-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319136 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nr6g\" (UniqueName: \"kubernetes.io/projected/05432485-7c44-439d-99d6-27fa4f4f3746-kube-api-access-9nr6g\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319161 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-config\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319181 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-config\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319200 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319261 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05432485-7c44-439d-99d6-27fa4f4f3746-tmpfs\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319311 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-serving-cert\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319331 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319350 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzdn7\" (UniqueName: \"kubernetes.io/projected/af3eca47-a30a-4d71-812b-01b5719b08e9-kube-api-access-jzdn7\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319389 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7kbl\" (UniqueName: \"kubernetes.io/projected/506aef5c-d853-41d9-94dd-0194fb7ac45a-kube-api-access-d7kbl\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319408 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb55p\" (UniqueName: \"kubernetes.io/projected/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-kube-api-access-kb55p\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-certs\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319465 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-webhook-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319499 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcxlv\" (UniqueName: \"kubernetes.io/projected/25a3f596-1eb9-4bdb-afe9-902545ed5197-kube-api-access-lcxlv\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319522 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abc25820-f936-4169-996c-d337ec58713b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319554 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abc25820-f936-4169-996c-d337ec58713b-proxy-tls\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319654 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/12e95361-86af-4218-b7f7-56582c3a17b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.319679 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/98ec7bfa-b340-4f65-ada5-db427bd64a4e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.330076 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.332486 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 06:49:52.832470566 +0000 UTC m=+143.318372929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.352715 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.409792 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.420895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.421079 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:52.921050405 +0000 UTC m=+143.406952768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421266 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421288 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-metrics-tls\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421317 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txlsd\" (UniqueName: \"kubernetes.io/projected/0d21a97a-a2c1-4296-b080-8af4e4a22638-kube-api-access-txlsd\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421344 5127 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7lblm\" (UniqueName: \"kubernetes.io/projected/abc25820-f936-4169-996c-d337ec58713b-kube-api-access-7lblm\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421359 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421391 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e5544-39fe-4dfe-843f-b9281085274e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421407 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-config-volume\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-mountpoint-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421442 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-plugins-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421458 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-profile-collector-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421473 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421498 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-ca\") pod 
\"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421818 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x76s\" (UniqueName: \"kubernetes.io/projected/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-kube-api-access-6x76s\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421837 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nv98\" (UniqueName: \"kubernetes.io/projected/2d5b2407-6eb7-4bac-8c75-cdd0820f3974-kube-api-access-2nv98\") pod \"migrator-59844c95c7-vcfxq\" (UID: \"2d5b2407-6eb7-4bac-8c75-cdd0820f3974\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421879 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-config\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421894 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-stats-auth\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421922 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421935 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-srv-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421953 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nr6g\" (UniqueName: \"kubernetes.io/projected/05432485-7c44-439d-99d6-27fa4f4f3746-kube-api-access-9nr6g\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.421970 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-config\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc 
kubenswrapper[5127]: I0201 06:49:52.421985 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-config\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422000 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422015 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05432485-7c44-439d-99d6-27fa4f4f3746-tmpfs\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422033 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qls78\" (UniqueName: \"kubernetes.io/projected/339e4434-c20b-49ab-8a89-234c633c788b-kube-api-access-qls78\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422049 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-serving-cert\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422081 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzdn7\" (UniqueName: \"kubernetes.io/projected/af3eca47-a30a-4d71-812b-01b5719b08e9-kube-api-access-jzdn7\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422105 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7kbl\" (UniqueName: \"kubernetes.io/projected/506aef5c-d853-41d9-94dd-0194fb7ac45a-kube-api-access-d7kbl\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422121 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb55p\" 
(UniqueName: \"kubernetes.io/projected/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-kube-api-access-kb55p\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422136 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-certs\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422152 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-webhook-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422190 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcxlv\" (UniqueName: \"kubernetes.io/projected/25a3f596-1eb9-4bdb-afe9-902545ed5197-kube-api-access-lcxlv\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422207 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abc25820-f936-4169-996c-d337ec58713b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422221 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abc25820-f936-4169-996c-d337ec58713b-proxy-tls\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422376 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/98ec7bfa-b340-4f65-ada5-db427bd64a4e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422407 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/12e95361-86af-4218-b7f7-56582c3a17b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422427 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghnrs\" (UniqueName: 
\"kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422454 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422487 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422510 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422532 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82v86\" (UniqueName: \"kubernetes.io/projected/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-kube-api-access-82v86\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpw4\" (UniqueName: \"kubernetes.io/projected/f542f82a-9107-4056-9beb-0fcc49df176a-kube-api-access-jjpw4\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422596 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422623 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwqzx\" (UniqueName: \"kubernetes.io/projected/bd3e5544-39fe-4dfe-843f-b9281085274e-kube-api-access-jwqzx\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422640 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-socket-dir\") pod 
\"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422675 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422699 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422715 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af3eca47-a30a-4d71-812b-01b5719b08e9-service-ca-bundle\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422731 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3e5544-39fe-4dfe-843f-b9281085274e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422747 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-srv-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422763 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422779 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422804 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422819 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-apiservice-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422846 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422866 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25a3f596-1eb9-4bdb-afe9-902545ed5197-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422882 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422896 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szrk4\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422913 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85e1eb9a-c828-43bc-9d44-4071fb0a1210-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422929 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-metrics-certs\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422951 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-service-ca\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422976 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.422991 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423016 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423031 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsh2m\" (UniqueName: \"kubernetes.io/projected/85e1eb9a-c828-43bc-9d44-4071fb0a1210-kube-api-access-rsh2m\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423047 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-csi-data-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423073 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-serving-cert\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423088 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-client\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423106 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-config\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423123 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvrsx\" (UniqueName: \"kubernetes.io/projected/98ec7bfa-b340-4f65-ada5-db427bd64a4e-kube-api-access-tvrsx\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423140 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-default-certificate\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423156 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-registration-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423176 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423193 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qftc6\" (UniqueName: \"kubernetes.io/projected/12e95361-86af-4218-b7f7-56582c3a17b7-kube-api-access-qftc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423209 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-node-bootstrap-token\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423229 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423246 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a3f596-1eb9-4bdb-afe9-902545ed5197-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423263 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7n9r\" (UniqueName: \"kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" 
Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.423316 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfmxg\" (UniqueName: \"kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.424133 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.424394 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.424892 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/abc25820-f936-4169-996c-d337ec58713b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.425021 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-config-volume\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.425108 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.425541 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-config\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.425629 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-ca\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.425737 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af3eca47-a30a-4d71-812b-01b5719b08e9-service-ca-bundle\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.426229 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3e5544-39fe-4dfe-843f-b9281085274e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.426387 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-config\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.428904 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-config\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.429784 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-config\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.429806 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.429999 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.430853 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 
06:49:52.433192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/98ec7bfa-b340-4f65-ada5-db427bd64a4e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.433512 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/05432485-7c44-439d-99d6-27fa4f4f3746-tmpfs\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.433520 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:52.933506085 +0000 UTC m=+143.419408448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.435603 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.435771 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.438319 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.438510 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-client\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.438723 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-default-certificate\") 
pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.438807 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.439376 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-certs\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.439415 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/12e95361-86af-4218-b7f7-56582c3a17b7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.439666 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-serving-cert\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.439960 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25a3f596-1eb9-4bdb-afe9-902545ed5197-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.440715 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.440853 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.441076 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-srv-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.441951 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-metrics-tls\") pod \"dns-default-5zh5x\" (UID: 
\"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.442040 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.444439 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/abc25820-f936-4169-996c-d337ec58713b-proxy-tls\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.451538 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-etcd-service-ca\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.459110 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.462989 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.466573 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0d21a97a-a2c1-4296-b080-8af4e4a22638-node-bootstrap-token\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.467628 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.467639 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.468336 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume\") pod \"collect-profiles-29498805-m62nb\" (UID: 
\"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.468375 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/85e1eb9a-c828-43bc-9d44-4071fb0a1210-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.469002 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.469041 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-stats-auth\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.469201 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-webhook-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.470769 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.470972 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0279d8c-fc4c-47c2-88f5-f9c4800d8667-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7pmzq\" (UID: \"e0279d8c-fc4c-47c2-88f5-f9c4800d8667\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.471349 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-serving-cert\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.471394 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e5544-39fe-4dfe-843f-b9281085274e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.471649 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: 
\"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.471773 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/506aef5c-d853-41d9-94dd-0194fb7ac45a-srv-cert\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.471834 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/05432485-7c44-439d-99d6-27fa4f4f3746-apiservice-cert\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.473860 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/af3eca47-a30a-4d71-812b-01b5719b08e9-metrics-certs\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.474104 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25a3f596-1eb9-4bdb-afe9-902545ed5197-serving-cert\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.476123 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txlsd\" (UniqueName: \"kubernetes.io/projected/0d21a97a-a2c1-4296-b080-8af4e4a22638-kube-api-access-txlsd\") pod \"machine-config-server-8vpcc\" (UID: \"0d21a97a-a2c1-4296-b080-8af4e4a22638\") " pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.476147 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f542f82a-9107-4056-9beb-0fcc49df176a-profile-collector-cert\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.477152 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.497451 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfmxg\" (UniqueName: \"kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg\") pod \"marketplace-operator-79b997595-zc4gp\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: W0201 06:49:52.507319 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06682504_c11d_41d1_838a_a336640770a8.slice/crio-506c7d8d1982d04ce9caf44fa1765c365bfe0b7814fb93792ed63153405b0036 WatchSource:0}: Error finding container 
506c7d8d1982d04ce9caf44fa1765c365bfe0b7814fb93792ed63153405b0036: Status 404 returned error can't find the container with id 506c7d8d1982d04ce9caf44fa1765c365bfe0b7814fb93792ed63153405b0036 Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.521942 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lblm\" (UniqueName: \"kubernetes.io/projected/abc25820-f936-4169-996c-d337ec58713b-kube-api-access-7lblm\") pod \"machine-config-controller-84d6567774-jnckc\" (UID: \"abc25820-f936-4169-996c-d337ec58713b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524250 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524392 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-csi-data-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524428 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-registration-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524488 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-mountpoint-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524503 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-plugins-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524561 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qls78\" (UniqueName: \"kubernetes.io/projected/339e4434-c20b-49ab-8a89-234c633c788b-kube-api-access-qls78\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524651 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-socket-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.524908 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-socket-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.524990 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.024972464 +0000 UTC m=+143.510874827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.526041 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-mountpoint-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.526116 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-registration-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.526152 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-plugins-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.526250 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/339e4434-c20b-49ab-8a89-234c633c788b-csi-data-dir\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.540964 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.541087 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-w4bsb"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.545031 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwqzx\" (UniqueName: \"kubernetes.io/projected/bd3e5544-39fe-4dfe-843f-b9281085274e-kube-api-access-jwqzx\") pod \"kube-storage-version-migrator-operator-b67b599dd-4svn6\" (UID: \"bd3e5544-39fe-4dfe-843f-b9281085274e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.563790 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8vpcc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.592509 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.614246 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.615315 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.616313 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz642"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.617737 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-m947v"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.618445 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb55p\" (UniqueName: \"kubernetes.io/projected/cda0b5e8-91e4-4b05-98a9-ce1c064dc80f-kube-api-access-kb55p\") pod \"etcd-operator-b45778765-ct7v5\" (UID: \"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.618702 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-8rw72"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.626101 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d43c7fd9-def1-4cb2-8913-2ffe8019f3fe-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nr4jk\" (UID: \"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.626413 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: 
\"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.626746 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.126725862 +0000 UTC m=+143.612628225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.632012 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzdn7\" (UniqueName: \"kubernetes.io/projected/af3eca47-a30a-4d71-812b-01b5719b08e9-kube-api-access-jzdn7\") pod \"router-default-5444994796-z7mc2\" (UID: \"af3eca47-a30a-4d71-812b-01b5719b08e9\") " pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.650099 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7kbl\" (UniqueName: \"kubernetes.io/projected/506aef5c-d853-41d9-94dd-0194fb7ac45a-kube-api-access-d7kbl\") pod \"olm-operator-6b444d44fb-rqb54\" (UID: \"506aef5c-d853-41d9-94dd-0194fb7ac45a\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.682069 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x76s\" (UniqueName: \"kubernetes.io/projected/0a4eee0e-8edb-4778-9ebe-f54a8ff6d355-kube-api-access-6x76s\") pod \"service-ca-operator-777779d784-8gp5z\" (UID: \"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.694364 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsh2m\" (UniqueName: \"kubernetes.io/projected/85e1eb9a-c828-43bc-9d44-4071fb0a1210-kube-api-access-rsh2m\") pod \"multus-admission-controller-857f4d67dd-pmp6b\" (UID: \"85e1eb9a-c828-43bc-9d44-4071fb0a1210\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.713548 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.719868 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nv98\" (UniqueName: \"kubernetes.io/projected/2d5b2407-6eb7-4bac-8c75-cdd0820f3974-kube-api-access-2nv98\") pod \"migrator-59844c95c7-vcfxq\" (UID: \"2d5b2407-6eb7-4bac-8c75-cdd0820f3974\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.723340 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.728433 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.730213 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.230186358 +0000 UTC m=+143.716088721 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.733544 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.734180 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.734478 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.234466059 +0000 UTC m=+143.720368422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.742835 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bb36202-47eb-4283-b1a7-3027e0ff5ae2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-hdpvr\" (UID: \"0bb36202-47eb-4283-b1a7-3027e0ff5ae2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.743349 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.743911 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.749071 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.764140 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.782751 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k5wzh"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.783029 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.783395 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.786056 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvrsx\" (UniqueName: \"kubernetes.io/projected/98ec7bfa-b340-4f65-ada5-db427bd64a4e-kube-api-access-tvrsx\") pod \"package-server-manager-789f6589d5-lc9tz\" (UID: \"98ec7bfa-b340-4f65-ada5-db427bd64a4e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.786873 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.796028 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.803251 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jpdck"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.821562 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82v86\" (UniqueName: \"kubernetes.io/projected/27eef381-a41a-4d53-bc8d-ed1cdc7b9109-kube-api-access-82v86\") pod \"dns-default-5zh5x\" (UID: \"27eef381-a41a-4d53-bc8d-ed1cdc7b9109\") " pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.824961 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.825624 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpw4\" (UniqueName: \"kubernetes.io/projected/f542f82a-9107-4056-9beb-0fcc49df176a-kube-api-access-jjpw4\") pod \"catalog-operator-68c6474976-tm8zv\" (UID: \"f542f82a-9107-4056-9beb-0fcc49df176a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.825855 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-blb44"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.829356 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcxlv\" (UniqueName: \"kubernetes.io/projected/25a3f596-1eb9-4bdb-afe9-902545ed5197-kube-api-access-lcxlv\") pod \"openshift-config-operator-7777fb866f-qxg44\" (UID: \"25a3f596-1eb9-4bdb-afe9-902545ed5197\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.830281 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.834931 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.835274 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.33526257 +0000 UTC m=+143.821164933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.837690 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.838761 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghnrs\" (UniqueName: \"kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs\") pod \"controller-manager-879f6c89f-mwdsw\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.854916 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nr6g\" (UniqueName: \"kubernetes.io/projected/05432485-7c44-439d-99d6-27fa4f4f3746-kube-api-access-9nr6g\") pod \"packageserver-d55dfcdfc-5fzh2\" (UID: \"05432485-7c44-439d-99d6-27fa4f4f3746\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.859226 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.880236 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.882149 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.887926 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:52 crc kubenswrapper[5127]: W0201 06:49:52.888662 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff15e8e3_79fb_4691_99c3_8956ba943381.slice/crio-0b26842c2f534fab14ff35304cb5f6730e262619407f7e8fc06de96bb10e6ca1 WatchSource:0}: Error finding container 0b26842c2f534fab14ff35304cb5f6730e262619407f7e8fc06de96bb10e6ca1: Status 404 returned error can't find the container with id 0b26842c2f534fab14ff35304cb5f6730e262619407f7e8fc06de96bb10e6ca1 Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.889021 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7n9r\" (UniqueName: \"kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r\") pod \"collect-profiles-29498805-m62nb\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.897157 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kvg79"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.897210 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms"] Feb 01 06:49:52 crc kubenswrapper[5127]: W0201 06:49:52.902019 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2288fe_2a63_4283_946c_5c5f1891a007.slice/crio-ce4a1934e2a38bdc7108894c3049c435f8f37f18f46370e9d58ae9b9a0054480 WatchSource:0}: Error finding container ce4a1934e2a38bdc7108894c3049c435f8f37f18f46370e9d58ae9b9a0054480: Status 404 returned error can't find the container with id ce4a1934e2a38bdc7108894c3049c435f8f37f18f46370e9d58ae9b9a0054480 Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.902918 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szrk4\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.924890 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qftc6\" (UniqueName: \"kubernetes.io/projected/12e95361-86af-4218-b7f7-56582c3a17b7-kube-api-access-qftc6\") pod \"control-plane-machine-set-operator-78cbb6b69f-pdls2\" (UID: \"12e95361-86af-4218-b7f7-56582c3a17b7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.936114 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qls78\" (UniqueName: \"kubernetes.io/projected/339e4434-c20b-49ab-8a89-234c633c788b-kube-api-access-qls78\") pod \"csi-hostpathplugin-rgmvx\" (UID: \"339e4434-c20b-49ab-8a89-234c633c788b\") " pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.936935 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.939005 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:52 crc kubenswrapper[5127]: E0201 06:49:52.939063 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.439049285 +0000 UTC m=+143.924951648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.947393 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:52 crc kubenswrapper[5127]: W0201 06:49:52.953777 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e586b38_fbcb_4142_b1b0_46f7824f9cc5.slice/crio-29a2ef7c0f60df6f452de84e8322ca6b667f8ea3a3585383878aea10a875c015 WatchSource:0}: Error finding container 29a2ef7c0f60df6f452de84e8322ca6b667f8ea3a3585383878aea10a875c015: Status 404 returned error can't find the container with id 29a2ef7c0f60df6f452de84e8322ca6b667f8ea3a3585383878aea10a875c015 Feb 01 06:49:52 crc kubenswrapper[5127]: W0201 06:49:52.957227 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c9cfc0f_f3e7_4a7a_860a_5961cfa3ab6c.slice/crio-eefbccab9daf461b9b4a42f0c7b324a33241793ed1d4136719d64608a4105d63 WatchSource:0}: Error finding container eefbccab9daf461b9b4a42f0c7b324a33241793ed1d4136719d64608a4105d63: Status 404 returned error can't find the container with id eefbccab9daf461b9b4a42f0c7b324a33241793ed1d4136719d64608a4105d63 Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.964280 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.984317 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6"] Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.990110 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8vpcc" event={"ID":"0d21a97a-a2c1-4296-b080-8af4e4a22638","Type":"ContainerStarted","Data":"e5f1726d1a7d41ccbc7f6d77fcc8fb67fbf62e6bd3268f9a467c8189eef1a071"} Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.990157 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8vpcc" 
event={"ID":"0d21a97a-a2c1-4296-b080-8af4e4a22638","Type":"ContainerStarted","Data":"043d459ed108364cef4b72d07e6445fb1b78663936de023e5c8894e1a7d3849c"} Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.994148 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfpgn" event={"ID":"99bc3500-5fb6-4d26-97dd-24dc06658294","Type":"ContainerStarted","Data":"7d61dcbe3da3e2a463fdd846be5ce3880fc51e835b205f6c3f58c65bfadeb8fc"} Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.997728 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" event={"ID":"04c86aea-4121-437a-98a3-e7d14024c548","Type":"ContainerStarted","Data":"32a4f4c352ae01baff8ba1ebd611cd4180cea0a361179c15f0a3d77fd4c50462"} Feb 01 06:49:52 crc kubenswrapper[5127]: I0201 06:49:52.997769 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" event={"ID":"04c86aea-4121-437a-98a3-e7d14024c548","Type":"ContainerStarted","Data":"3570226c122b046ff118d3d8d517abefd05593f166aa52b6f281905b347bb58a"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:52.998861 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpdck" event={"ID":"bd2288fe-2a63-4283-946c-5c5f1891a007","Type":"ContainerStarted","Data":"ce4a1934e2a38bdc7108894c3049c435f8f37f18f46370e9d58ae9b9a0054480"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:52.999975 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" event={"ID":"ef8c34c0-f858-472e-8560-5e7806b32eab","Type":"ContainerStarted","Data":"46d8a013987bc5c13505df9e8dcdcce4985b895d83011293af9c42c5f750e8c7"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.000972 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" event={"ID":"81a435a5-ae11-4994-86f9-cdabafa80e4f","Type":"ContainerStarted","Data":"e6d3524ae6f3cd3f8a867e0de0b0501258e9835812f24361c542517ff74d8965"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.003829 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" event={"ID":"5e364462-04d8-45ed-b896-7a82db12c738","Type":"ContainerStarted","Data":"f991c6eaa81979b0db441927da29b1edf22a34f4791bf960a04108494a36c654"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.003886 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" event={"ID":"5e364462-04d8-45ed-b896-7a82db12c738","Type":"ContainerStarted","Data":"93465cb33b7d993d7b5d19e1fa82e14aa56c96b2dd1b98fbe645b211db39b1f6"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.007789 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" event={"ID":"ded37ad4-73dd-4eba-967f-ce65cb3385bb","Type":"ContainerStarted","Data":"5aa321920fc63c2171d52754079887863eb2143b716554b48c22a993026611b1"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.009808 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" 
event={"ID":"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c","Type":"ContainerStarted","Data":"eefbccab9daf461b9b4a42f0c7b324a33241793ed1d4136719d64608a4105d63"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.012659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-blb44" event={"ID":"501ce6ad-6013-458e-b398-d0b8ca7c1915","Type":"ContainerStarted","Data":"aa5137ec9c9224b17bb1fdcea4631dccb300e196b063b2a3475d813a7077898f"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.013570 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" event={"ID":"ff15e8e3-79fb-4691-99c3-8956ba943381","Type":"ContainerStarted","Data":"0b26842c2f534fab14ff35304cb5f6730e262619407f7e8fc06de96bb10e6ca1"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.020040 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-z8hkn"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.035517 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" event={"ID":"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45","Type":"ContainerStarted","Data":"aa56f3ef14bc2afa81419f6a1213a37edea8bb7d2b064a06c783e38bc2ac4c61"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.038012 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.038230 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.53820449 +0000 UTC m=+144.024106853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.038549 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.039437 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.539427675 +0000 UTC m=+144.025330038 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.042024 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-m947v" event={"ID":"c828d93e-bd87-4588-80af-0f6e69f9c81f","Type":"ContainerStarted","Data":"700698dbb9f140ad8379b33910afd309f44bbd8b69175da292dcbab243b58f78"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.043758 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-m947v" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.047634 5127 patch_prober.go:28] interesting pod/console-operator-58897d9998-m947v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.047711 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-m947v" podUID="c828d93e-bd87-4588-80af-0f6e69f9c81f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.061557 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.079893 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" event={"ID":"06682504-c11d-41d1-838a-a336640770a8","Type":"ContainerStarted","Data":"344e8cf1b56504e4f618103bc3bad4ebd1f6fc58d6453612a2b7c7203fb31887"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.079940 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" event={"ID":"06682504-c11d-41d1-838a-a336640770a8","Type":"ContainerStarted","Data":"506c7d8d1982d04ce9caf44fa1765c365bfe0b7814fb93792ed63153405b0036"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.080384 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.081721 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" event={"ID":"d7eea465-934a-4ec6-8eeb-2d7199fc3594","Type":"ContainerStarted","Data":"65a8f5bd73fc7a31392a8c191c7d02b36bc4500c7521783eb3a6b23ca5528c32"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.085466 5127 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mpbrq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.085483 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8rw72" event={"ID":"27bef954-7005-4626-9049-195b48a9365b","Type":"ContainerStarted","Data":"2e2b9540fe2b1a8012e0638a5db637bfa3a3c1ae84be37e1808dfb8e357e0899"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.085507 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.087104 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" event={"ID":"3e586b38-fbcb-4142-b1b0-46f7824f9cc5","Type":"ContainerStarted","Data":"29a2ef7c0f60df6f452de84e8322ca6b667f8ea3a3585383878aea10a875c015"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.088431 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" event={"ID":"9b48d863-39c0-40ca-b016-2c92f284eace","Type":"ContainerStarted","Data":"fc4ab22f71ab607a4734042800449fb64218cfbd07ed433550d1e14eb4805cba"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.088873 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.089870 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" event={"ID":"015e8567-adb7-421c-8613-1611c4768cbe","Type":"ContainerStarted","Data":"18edf5c06cfdb4ddba77bd726dab5c85d0e130e24a2a90dabcff482a6f8e10b3"} Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.139802 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.139958 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.639933709 +0000 UTC m=+144.125836072 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.141317 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.142373 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.642329966 +0000 UTC m=+144.128232329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.155573 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:53 crc kubenswrapper[5127]: W0201 06:49:53.156947 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc57b8f_e457_482e_8c41_27fbb779b6a5.slice/crio-2d0ec90b25ba6aade0cf13e4356bd745820760f20a332566333da1d47990ecda WatchSource:0}: Error finding container 2d0ec90b25ba6aade0cf13e4356bd745820760f20a332566333da1d47990ecda: Status 404 returned error can't find the container with id 2d0ec90b25ba6aade0cf13e4356bd745820760f20a332566333da1d47990ecda Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.202373 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.243020 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.244743 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.744726832 +0000 UTC m=+144.230629195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.282301 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.346317 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.348169 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.848145957 +0000 UTC m=+144.334048340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.447324 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.447786 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.947731075 +0000 UTC m=+144.433633438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.448115 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.448660 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:53.948652211 +0000 UTC m=+144.434554574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.448665 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.531121 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.549246 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.549694 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.049678168 +0000 UTC m=+144.535580531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.551148 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.622296 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.651132 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.651462 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.151450307 +0000 UTC m=+144.637352660 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: W0201 06:49:53.707705 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506aef5c_d853_41d9_94dd_0194fb7ac45a.slice/crio-0e67c97e1f45630d575972a937017748a25d27855f2c2bfb49beeb4fccd7d9c3 WatchSource:0}: Error finding container 0e67c97e1f45630d575972a937017748a25d27855f2c2bfb49beeb4fccd7d9c3: Status 404 returned error can't find the container with id 0e67c97e1f45630d575972a937017748a25d27855f2c2bfb49beeb4fccd7d9c3 Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.752201 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.753165 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.753927 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.253902425 +0000 UTC m=+144.739804778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.768733 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ct7v5"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.768783 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-pmp6b"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.857440 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.857963 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 06:49:54.357946708 +0000 UTC m=+144.843849071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.944068 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.949931 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.958447 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz"] Feb 01 06:49:53 crc kubenswrapper[5127]: I0201 06:49:53.958487 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:53 crc kubenswrapper[5127]: E0201 06:49:53.958810 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.45879122 +0000 UTC m=+144.944693583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.017954 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5zh5x"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.060437 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.060802 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.560791315 +0000 UTC m=+145.046693678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.067898 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.069193 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-m947v" podStartSLOduration=124.069179521 podStartE2EDuration="2m4.069179521s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.068953625 +0000 UTC m=+144.554855988" watchObservedRunningTime="2026-02-01 06:49:54.069179521 +0000 UTC m=+144.555081884" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.160000 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" podStartSLOduration=124.159956761 podStartE2EDuration="2m4.159956761s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.1410438 +0000 UTC m=+144.626946163" watchObservedRunningTime="2026-02-01 06:49:54.159956761 +0000 UTC m=+144.645859124" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.171403 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rgmvx"] Feb 01 06:49:54 crc kubenswrapper[5127]: W0201 06:49:54.171978 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05432485_7c44_439d_99d6_27fa4f4f3746.slice/crio-3489e8f32b4af3729a54dfb25308d0d03e702d7b453ce49e31f0b31c2baf8171 WatchSource:0}: Error finding container 3489e8f32b4af3729a54dfb25308d0d03e702d7b453ce49e31f0b31c2baf8171: Status 404 returned error can't find the container with id 3489e8f32b4af3729a54dfb25308d0d03e702d7b453ce49e31f0b31c2baf8171 Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.175571 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.188514 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" event={"ID":"04c86aea-4121-437a-98a3-e7d14024c548","Type":"ContainerStarted","Data":"cfada7397ddf561b51ee820884cfdd3d190b14d478a5da7c5a5ce1998f91475d"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.196282 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" 
event={"ID":"4bc57b8f-e457-482e-8c41-27fbb779b6a5","Type":"ContainerStarted","Data":"2d0ec90b25ba6aade0cf13e4356bd745820760f20a332566333da1d47990ecda"} Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.198392 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.69835202 +0000 UTC m=+145.184254393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.209183 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" event={"ID":"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f","Type":"ContainerStarted","Data":"25f4e9b54eda67e4865ab05e7bc167c20bdff05042f9e99d224409a92dcb3f00"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.227535 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.249288 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8vpcc" podStartSLOduration=5.24927288 podStartE2EDuration="5.24927288s" podCreationTimestamp="2026-02-01 06:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.247762037 +0000 UTC m=+144.733664400" watchObservedRunningTime="2026-02-01 06:49:54.24927288 +0000 UTC m=+144.735175243" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.258375 5127 generic.go:334] "Generic (PLEG): container finished" podID="b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45" containerID="131a1f07f6e08b3b4391da86c8bc1706bfc8d3c4ef29c7821989f5205154884d" exitCode=0 Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.265485 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" event={"ID":"81a435a5-ae11-4994-86f9-cdabafa80e4f","Type":"ContainerStarted","Data":"3acc33fee005d710f5ed476f154b8da18e3b2f8f2306933484c2aaf5db5b9688"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275098 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qxg44"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275185 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275196 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-blb44" event={"ID":"501ce6ad-6013-458e-b398-d0b8ca7c1915","Type":"ContainerStarted","Data":"d847b3f46ac7c5303a4e4a07be5dc5dbdb90d44a98a44a9f9b3c3695829c11d2"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275214 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" event={"ID":"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45","Type":"ContainerDied","Data":"131a1f07f6e08b3b4391da86c8bc1706bfc8d3c4ef29c7821989f5205154884d"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275226 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerStarted","Data":"8ceaa3d49815fb4a8ac9d12178e69e8f1dfb60126abfb336a8321bafe4fa3077"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.275238 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerStarted","Data":"79438ca8d045e5413d268500e0cbbf6a39a98413ea29ec379579c4da130a211f"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.277014 5127 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zc4gp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.277116 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.277351 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" event={"ID":"9b48d863-39c0-40ca-b016-2c92f284eace","Type":"ContainerStarted","Data":"64cdea247cd67a806dc9581d014fe01341e8243a4aa72af65fa13573d4599a49"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.282690 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.283307 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2"] Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.299643 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.302518 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.802507986 +0000 UTC m=+145.288410349 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.311975 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-8rw72" event={"ID":"27bef954-7005-4626-9049-195b48a9365b","Type":"ContainerStarted","Data":"5ed406b59a7c1ebaac58b48a1fb8a5f531f050a6f8c1b9c81e49da551b5de03a"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.316353 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-8rw72" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.318220 5127 patch_prober.go:28] interesting pod/downloads-7954f5f757-8rw72 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.318415 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8rw72" podUID="27bef954-7005-4626-9049-195b48a9365b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.326099 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jpdck" event={"ID":"bd2288fe-2a63-4283-946c-5c5f1891a007","Type":"ContainerStarted","Data":"e0b9dfa7e18152937aa85c9f9c662dabe350562973c0c36cd60b452cfb2a9e6a"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.337053 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" event={"ID":"98ec7bfa-b340-4f65-ada5-db427bd64a4e","Type":"ContainerStarted","Data":"9a6d700a7f21132739b85c88f1208ee8858f5569fe982dccbceb1021d2b26162"} Feb 01 06:49:54 crc kubenswrapper[5127]: W0201 06:49:54.338346 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c0fb18_dc93_4aed_abd0_55631d324b99.slice/crio-5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be WatchSource:0}: Error finding container 5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be: Status 404 returned error can't find the container with id 5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.343037 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z7mc2" event={"ID":"af3eca47-a30a-4d71-812b-01b5719b08e9","Type":"ContainerStarted","Data":"fe09af8ea40660ef323b8dcdb789c8deba9940aa722b066dc5742bee76f5a410"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.372780 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" 
event={"ID":"ded37ad4-73dd-4eba-967f-ce65cb3385bb","Type":"ContainerStarted","Data":"0c083250eb7c71469f6e3990593c4faade3366c8575e74f296782242e492af64"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.381380 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" event={"ID":"d7eea465-934a-4ec6-8eeb-2d7199fc3594","Type":"ContainerStarted","Data":"05fdaa750a87bdf60c313406424a985a403ee5199fb4a7c2dae62ff37c0a35b0"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.393176 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" event={"ID":"015e8567-adb7-421c-8613-1611c4768cbe","Type":"ContainerStarted","Data":"ed4069d38165f1ed1bea3191fe22fb22b06456f1d4abc9cfd9e5dafcb2bdc59b"} Feb 01 06:49:54 crc kubenswrapper[5127]: W0201 06:49:54.393520 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf542f82a_9107_4056_9beb_0fcc49df176a.slice/crio-e6907089b2a6038ace398f0339d53d358b16e529b5e02c39f453afb88ee101ac WatchSource:0}: Error finding container e6907089b2a6038ace398f0339d53d358b16e529b5e02c39f453afb88ee101ac: Status 404 returned error can't find the container with id e6907089b2a6038ace398f0339d53d358b16e529b5e02c39f453afb88ee101ac Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.400628 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.402721 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:54.90270509 +0000 UTC m=+145.388607453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.414966 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-m947v" event={"ID":"c828d93e-bd87-4588-80af-0f6e69f9c81f","Type":"ContainerStarted","Data":"463758d86cd7453ddc1313f2038d591274977a941453f3ab24e5dddff84855b0"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.415722 5127 patch_prober.go:28] interesting pod/console-operator-58897d9998-m947v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.415775 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-m947v" podUID="c828d93e-bd87-4588-80af-0f6e69f9c81f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/readyz\": dial tcp 10.217.0.21:8443: connect: connection refused" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.448205 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfpgn" event={"ID":"99bc3500-5fb6-4d26-97dd-24dc06658294","Type":"ContainerStarted","Data":"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.450507 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" event={"ID":"abc25820-f936-4169-996c-d337ec58713b","Type":"ContainerStarted","Data":"71acaf70f6a5379f92b7353aa93c98d826e94b9e9a7d062f6a479aa8f229589c"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.464721 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" event={"ID":"ef8c34c0-f858-472e-8560-5e7806b32eab","Type":"ContainerStarted","Data":"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"} Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.465327 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.476938 5127 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-kz642 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.476990 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.479445 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" event={"ID":"5e364462-04d8-45ed-b896-7a82db12c738","Type":"ContainerStarted","Data":"e8e9d62dbd141fc41f4a211ff9a0fc413319c36c176be291c413cd7849d97bbb"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.485298 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" event={"ID":"e0279d8c-fc4c-47c2-88f5-f9c4800d8667","Type":"ContainerStarted","Data":"3bef34551faa77ba0cc49d8165bd9ad97366e18bb72580f21817c5bb14cdb73d"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.488884 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zh5x" event={"ID":"27eef381-a41a-4d53-bc8d-ed1cdc7b9109","Type":"ContainerStarted","Data":"1b190b0866ab136a3e30f6f2bf88f743815b9b78c3b8c6ee38980133d88cf481"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.490227 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" event={"ID":"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355","Type":"ContainerStarted","Data":"f6d60aa5d30dda47756a1208fb11fb8406d0b517c5bff115db47ba08edf42459"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.497415 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" event={"ID":"bd3e5544-39fe-4dfe-843f-b9281085274e","Type":"ContainerStarted","Data":"c3c7b327138599de33e3d815350cb6ee991f961f9ab5836ef212efc90983850e"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.499559 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" event={"ID":"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19","Type":"ContainerStarted","Data":"32ee07ac1a814b003b417efa26b8aa0b63313ec96c2604225ce78123c0e1632f"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.503697 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.505543 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.005520798 +0000 UTC m=+145.491423331 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.513331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" event={"ID":"ff15e8e3-79fb-4691-99c3-8956ba943381","Type":"ContainerStarted","Data":"1825b457cd324783e560febbcefc1207e211fad20bf86d73e1a867183d4e0ffa"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.517427 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" event={"ID":"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c","Type":"ContainerStarted","Data":"a446f6d863a37871ad2499641545ca42bf039384ae5131e4f3edf0e188ac9c15"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.520136 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" event={"ID":"506aef5c-d853-41d9-94dd-0194fb7ac45a","Type":"ContainerStarted","Data":"0e67c97e1f45630d575972a937017748a25d27855f2c2bfb49beeb4fccd7d9c3"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.525703 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" event={"ID":"2d5b2407-6eb7-4bac-8c75-cdd0820f3974","Type":"ContainerStarted","Data":"d61d86eb99e6b586d7aa60484c6e7c9fd086a0a65cdf5b68d295e3ac99a9042d"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.527434 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" event={"ID":"85e1eb9a-c828-43bc-9d44-4071fb0a1210","Type":"ContainerStarted","Data":"d0b818e5bf1dc94ecc8b7b13f90789bd73df91502cc74336eebd3da352fa5ec7"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.529843 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" event={"ID":"0bb36202-47eb-4283-b1a7-3027e0ff5ae2","Type":"ContainerStarted","Data":"5755e9c1a04f38d5ec443a8743c4d27df888d9222b78a68900052e2ff15e6401"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.531203 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" event={"ID":"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe","Type":"ContainerStarted","Data":"a45158c9744069406a2649dcf7fa1faf753ccfb8ff8cad9244302b11271ae836"}
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.537303 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.605261 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.605832 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.105786075 +0000 UTC m=+145.591688438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.706897 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.707242 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.207230914 +0000 UTC m=+145.693133277 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.748069 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-blb44" podStartSLOduration=124.748051181 podStartE2EDuration="2m4.748051181s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.746916059 +0000 UTC m=+145.232818422" watchObservedRunningTime="2026-02-01 06:49:54.748051181 +0000 UTC m=+145.233953544"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.800162 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k5wzh" podStartSLOduration=125.800139714 podStartE2EDuration="2m5.800139714s" podCreationTimestamp="2026-02-01 06:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.797197511 +0000 UTC m=+145.283099874" watchObservedRunningTime="2026-02-01 06:49:54.800139714 +0000 UTC m=+145.286042077"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.810161 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.810530 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.310514746 +0000 UTC m=+145.796417109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.828442 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" podStartSLOduration=124.828422578 podStartE2EDuration="2m4.828422578s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.827861833 +0000 UTC m=+145.313764196" watchObservedRunningTime="2026-02-01 06:49:54.828422578 +0000 UTC m=+145.314324941"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.913075 5127 csr.go:261] certificate signing request csr-66ckz is approved, waiting to be issued
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.913096 5127 csr.go:257] certificate signing request csr-66ckz is issued
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.913770 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:54 crc kubenswrapper[5127]: E0201 06:49:54.914052 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.414041204 +0000 UTC m=+145.899943567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.914451 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lhf7v" podStartSLOduration=125.914431465 podStartE2EDuration="2m5.914431465s" podCreationTimestamp="2026-02-01 06:47:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.88045296 +0000 UTC m=+145.366355323" watchObservedRunningTime="2026-02-01 06:49:54.914431465 +0000 UTC m=+145.400333828"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.915005 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-zfpgn" podStartSLOduration=124.91500207 podStartE2EDuration="2m4.91500207s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.913137089 +0000 UTC m=+145.399039452" watchObservedRunningTime="2026-02-01 06:49:54.91500207 +0000 UTC m=+145.400904433"
Feb 01 06:49:54 crc kubenswrapper[5127]: I0201 06:49:54.947501 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-8rw72" podStartSLOduration=124.947481643 podStartE2EDuration="2m4.947481643s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:54.946080973 +0000 UTC m=+145.431983326" watchObservedRunningTime="2026-02-01 06:49:54.947481643 +0000 UTC m=+145.433384006"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.015599 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.015806 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.515769211 +0000 UTC m=+146.001671574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.015935 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.016321 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.516301786 +0000 UTC m=+146.002204149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.052843 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" podStartSLOduration=125.052812162 podStartE2EDuration="2m5.052812162s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.051505425 +0000 UTC m=+145.537407788" watchObservedRunningTime="2026-02-01 06:49:55.052812162 +0000 UTC m=+145.538714525"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.109146 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-czrhc" podStartSLOduration=125.109127814 podStartE2EDuration="2m5.109127814s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.106235833 +0000 UTC m=+145.592138196" watchObservedRunningTime="2026-02-01 06:49:55.109127814 +0000 UTC m=+145.595030177"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.117481 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.118031 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.618009813 +0000 UTC m=+146.103912176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.158150 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f8hq9" podStartSLOduration=125.15812706 podStartE2EDuration="2m5.15812706s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.156129154 +0000 UTC m=+145.642031517" watchObservedRunningTime="2026-02-01 06:49:55.15812706 +0000 UTC m=+145.644029423"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.202150 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jpdck" podStartSLOduration=6.202112106 podStartE2EDuration="6.202112106s" podCreationTimestamp="2026-02-01 06:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.195882331 +0000 UTC m=+145.681784694" watchObservedRunningTime="2026-02-01 06:49:55.202112106 +0000 UTC m=+145.688014469"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.220234 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.220542 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.720531153 +0000 UTC m=+146.206433516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.276912 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-mcc7n" podStartSLOduration=125.276895337 podStartE2EDuration="2m5.276895337s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.228063395 +0000 UTC m=+145.713965758" watchObservedRunningTime="2026-02-01 06:49:55.276895337 +0000 UTC m=+145.762797700"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.277019 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdjnm" podStartSLOduration=125.27701403 podStartE2EDuration="2m5.27701403s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.274413387 +0000 UTC m=+145.760315750" watchObservedRunningTime="2026-02-01 06:49:55.27701403 +0000 UTC m=+145.762916393"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.321375 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.321527 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.821492399 +0000 UTC m=+146.307394762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.322398 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.322797 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.822781405 +0000 UTC m=+146.308683768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.428102 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.428483 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:55.928468955 +0000 UTC m=+146.414371318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.529913 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.530604 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.030574562 +0000 UTC m=+146.516476925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.565920 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" event={"ID":"015e8567-adb7-421c-8613-1611c4768cbe","Type":"ContainerStarted","Data":"a85a8c595f76b25b7497f0d143a1fd74cb9cd1602abffb6b37a97355eb42be4d"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.572186 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-z7mc2" event={"ID":"af3eca47-a30a-4d71-812b-01b5719b08e9","Type":"ContainerStarted","Data":"9f8456ebe33585d098e596f39b4bd269d2d76a36b9f3329e60f42f89157a1967"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.607707 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" event={"ID":"2d5b2407-6eb7-4bac-8c75-cdd0820f3974","Type":"ContainerStarted","Data":"68c04eca62d91e48887c9b6e362017b9a76ab9dc310f410587497f6f1ad65433"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.607754 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" event={"ID":"2d5b2407-6eb7-4bac-8c75-cdd0820f3974","Type":"ContainerStarted","Data":"caa3216250f424045079b5ca2e65154ed7d38464cf65c854ad62c5211820c751"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.617594 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" podStartSLOduration=125.617558146 podStartE2EDuration="2m5.617558146s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.320923593 +0000 UTC m=+145.806825956" watchObservedRunningTime="2026-02-01 06:49:55.617558146 +0000 UTC m=+146.103460509"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.618066 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5pg2r" podStartSLOduration=125.618059379 podStartE2EDuration="2m5.618059379s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.604722246 +0000 UTC m=+146.090624609" watchObservedRunningTime="2026-02-01 06:49:55.618059379 +0000 UTC m=+146.103961732"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.635995 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.637325 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.13730504 +0000 UTC m=+146.623207403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.660030 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" event={"ID":"85e1eb9a-c828-43bc-9d44-4071fb0a1210","Type":"ContainerStarted","Data":"8ab5f9b9e8a086534c3810f305cf728a9dac3de093850a28fe51784cd5159364"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.669930 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" event={"ID":"2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c","Type":"ContainerStarted","Data":"f796accf3af2edbc488a67ec001dbf514b30fd2328659bb78354b6dec31d116f"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.679304 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" event={"ID":"25a3f596-1eb9-4bdb-afe9-902545ed5197","Type":"ContainerStarted","Data":"6ec5945a268bda76ce78ed15012b934d30eaf49dff1b01847272084240448067"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.679374 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" event={"ID":"25a3f596-1eb9-4bdb-afe9-902545ed5197","Type":"ContainerStarted","Data":"658b4bea2c99825a86fe35948f264658101195b95afe0094becb8c3d171734bd"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.681327 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-z7mc2" podStartSLOduration=125.681304116 podStartE2EDuration="2m5.681304116s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.654052651 +0000 UTC m=+146.139955014" watchObservedRunningTime="2026-02-01 06:49:55.681304116 +0000 UTC m=+146.167206479"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.695116 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vcfxq" podStartSLOduration=125.695072923 podStartE2EDuration="2m5.695072923s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.677966833 +0000 UTC m=+146.163869196" watchObservedRunningTime="2026-02-01 06:49:55.695072923 +0000 UTC m=+146.180975286"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.702116 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kvg79" podStartSLOduration=125.70210051 podStartE2EDuration="2m5.70210051s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.700547307 +0000 UTC m=+146.186449670" watchObservedRunningTime="2026-02-01 06:49:55.70210051 +0000 UTC m=+146.188002873"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.710459 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" event={"ID":"d7eea465-934a-4ec6-8eeb-2d7199fc3594","Type":"ContainerStarted","Data":"3102e5788098e463941ad0891068d6e88553fd11f5ced9103526a261b23244a3"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.746289 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.747922 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.247906177 +0000 UTC m=+146.733808540 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.752713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" event={"ID":"b6c0fb18-dc93-4aed-abd0-55631d324b99","Type":"ContainerStarted","Data":"a428cffef45a764e039c470bf17d5dd774730f225654e331579ef58a51de2f36"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.759939 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" event={"ID":"b6c0fb18-dc93-4aed-abd0-55631d324b99","Type":"ContainerStarted","Data":"5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.769106 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-r6zts" podStartSLOduration=125.769082002 podStartE2EDuration="2m5.769082002s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.765125701 +0000 UTC m=+146.251028064" watchObservedRunningTime="2026-02-01 06:49:55.769082002 +0000 UTC m=+146.254984365"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.779883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" event={"ID":"e0279d8c-fc4c-47c2-88f5-f9c4800d8667","Type":"ContainerStarted","Data":"bcde4b7f9b5cf808e340e4411d308e91fcdb6f32abb74ff2c86804962955f5d9"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.784053 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-z7mc2"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.800919 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" podStartSLOduration=125.800899966 podStartE2EDuration="2m5.800899966s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.797064888 +0000 UTC m=+146.282967251" watchObservedRunningTime="2026-02-01 06:49:55.800899966 +0000 UTC m=+146.286802329"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.809953 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:49:55 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:49:55 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:49:55 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.809997 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.811025 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" event={"ID":"12e95361-86af-4218-b7f7-56582c3a17b7","Type":"ContainerStarted","Data":"6437a9ead57adec8a8a27dbe4a4a467ca85091ece7eafe6b6e72a7a5fef4e2f2"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.811057 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" event={"ID":"12e95361-86af-4218-b7f7-56582c3a17b7","Type":"ContainerStarted","Data":"8ae934e281fba6a4b4f637401d9d484ca18a738fec8eb174bfd98834936e020a"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.818117 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" event={"ID":"339e4434-c20b-49ab-8a89-234c633c788b","Type":"ContainerStarted","Data":"f1a0f48a6c57523f1b2339c468bc5b2f0652516577c63862e9f660a73e25b3c1"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.852215 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.853474 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.353457682 +0000 UTC m=+146.839360045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.863165 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7pmzq" podStartSLOduration=125.863146254 podStartE2EDuration="2m5.863146254s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.827971536 +0000 UTC m=+146.313873899" watchObservedRunningTime="2026-02-01 06:49:55.863146254 +0000 UTC m=+146.349048617"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.877562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" event={"ID":"abc25820-f936-4169-996c-d337ec58713b","Type":"ContainerStarted","Data":"b932bcb007d7960aa92f7650f8e7c2f2c0f677aaa491c8c2dd3cd17c1d9332a3"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.880004 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" event={"ID":"506aef5c-d853-41d9-94dd-0194fb7ac45a","Type":"ContainerStarted","Data":"a1223f26707f36d2173e24a3ea73afb7afc8f68417269288b9fae9787a6cf176"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.881185 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.886037 5127 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rqb54 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.886086 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" podUID="506aef5c-d853-41d9-94dd-0194fb7ac45a" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.911785 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pdls2" podStartSLOduration=125.91176827 podStartE2EDuration="2m5.91176827s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.863605927 +0000 UTC m=+146.349508290" watchObservedRunningTime="2026-02-01 06:49:55.91176827 +0000 UTC m=+146.397670633"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.913966 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-01 06:44:54 +0000 UTC, rotation deadline is 2026-11-10 00:31:22.165656758 +0000 UTC
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.913998 5127 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6761h41m26.251661785s for next certificate rotation
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.929130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" event={"ID":"cda0b5e8-91e4-4b05-98a9-ce1c064dc80f","Type":"ContainerStarted","Data":"a2bb23bad928b52091f3520624d0e28e32dba30b9beb8ac03fd0755b544b3aaa"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.956310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:55 crc kubenswrapper[5127]: E0201 06:49:55.957647 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.457624739 +0000 UTC m=+146.943527102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.972936 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" event={"ID":"98ec7bfa-b340-4f65-ada5-db427bd64a4e","Type":"ContainerStarted","Data":"3345f8ba8501e1daca375667e4d0c020be66c4cb135ff90413a3d62a9a0fd985"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.996952 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" event={"ID":"05432485-7c44-439d-99d6-27fa4f4f3746","Type":"ContainerStarted","Data":"c9d318b88c1d4408da06629d2b5a45351d4111981d01a892db143324872bf3fa"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.997003 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" event={"ID":"05432485-7c44-439d-99d6-27fa4f4f3746","Type":"ContainerStarted","Data":"3489e8f32b4af3729a54dfb25308d0d03e702d7b453ce49e31f0b31c2baf8171"}
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.997972 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2"
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.999240 5127 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5fzh2 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body=
Feb 01 06:49:55 crc kubenswrapper[5127]: I0201 06:49:55.999272 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" podUID="05432485-7c44-439d-99d6-27fa4f4f3746" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.010211 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" podStartSLOduration=126.010193815 podStartE2EDuration="2m6.010193815s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:55.912276514 +0000 UTC m=+146.398178877" watchObservedRunningTime="2026-02-01 06:49:56.010193815 +0000 UTC m=+146.496096178"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.016094 5127 generic.go:334] "Generic (PLEG): container finished" podID="3e586b38-fbcb-4142-b1b0-46f7824f9cc5" containerID="6e483b350fecda58908df00a0b5a83296cd69e838ea88cb545a472d89d853bf9" exitCode=0
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.016264 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" event={"ID":"3e586b38-fbcb-4142-b1b0-46f7824f9cc5","Type":"ContainerDied","Data":"6e483b350fecda58908df00a0b5a83296cd69e838ea88cb545a472d89d853bf9"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.042055 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" event={"ID":"0bb36202-47eb-4283-b1a7-3027e0ff5ae2","Type":"ContainerStarted","Data":"b7e90b1e92f94b84c2ca71683ede14ad9981fc109c252e7b67d5842d5bc57dee"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.057992 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.059422 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.559407488 +0000 UTC m=+147.045309851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.073908 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" event={"ID":"0a4eee0e-8edb-4778-9ebe-f54a8ff6d355","Type":"ContainerStarted","Data":"cb372aa0f8d12eacaaa28be69dc86650a9fb5814552cce1de45c4c2b70e49852"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.075263 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ct7v5" podStartSLOduration=126.075245072 podStartE2EDuration="2m6.075245072s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.011648596 +0000 UTC m=+146.497550959" watchObservedRunningTime="2026-02-01 06:49:56.075245072 +0000 UTC m=+146.561147435"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.103678 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" event={"ID":"d43c7fd9-def1-4cb2-8913-2ffe8019f3fe","Type":"ContainerStarted","Data":"6b0a8e20496cfe14c55de5f61637070d19fe16f6414ca26f6a9fdcea2f83a115"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.107152 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" podStartSLOduration=126.107141588 podStartE2EDuration="2m6.107141588s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.077083104 +0000 UTC m=+146.562985467" watchObservedRunningTime="2026-02-01 06:49:56.107141588 +0000 UTC m=+146.593043951"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.124402 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" event={"ID":"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19","Type":"ContainerStarted","Data":"34f6cce712a98ed1bf252839d852b104a9251477ae9239a79818d89dfcd5d94a"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.125743 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.130933 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4svn6" event={"ID":"bd3e5544-39fe-4dfe-843f-b9281085274e","Type":"ContainerStarted","Data":"8a3876ca5c28ca2556a5bb8b89b73b520da401b8d60b54304a8f9345535eb882"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.152067 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" event={"ID":"f542f82a-9107-4056-9beb-0fcc49df176a","Type":"ContainerStarted","Data":"8cb19977ac2efbd6fe4558ad08352839260bf652bbd928d2ebbafe8140f4b5b3"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.152139 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" event={"ID":"f542f82a-9107-4056-9beb-0fcc49df176a","Type":"ContainerStarted","Data":"e6907089b2a6038ace398f0339d53d358b16e529b5e02c39f453afb88ee101ac"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.152335 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.153613 5127 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwdsw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.153674 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.154775 5127 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-tm8zv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body=
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.154813 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" podUID="f542f82a-9107-4056-9beb-0fcc49df176a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.166069 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.167695 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.667679108 +0000 UTC m=+147.153581471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.188402 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" event={"ID":"4bc57b8f-e457-482e-8c41-27fbb779b6a5","Type":"ContainerStarted","Data":"75a20985c3177d3296efc698f356ca0b319744bb8f6a9a56e2a71e226eaf2ab2"}
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.189738 5127 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-zc4gp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body=
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.189782 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.191208 5127 patch_prober.go:28] interesting pod/downloads-7954f5f757-8rw72 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.191488 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8rw72" podUID="27bef954-7005-4626-9049-195b48a9365b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.224033 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-hdpvr" podStartSLOduration=126.224010291 podStartE2EDuration="2m6.224010291s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.116989384 +0000 UTC m=+146.602891747" watchObservedRunningTime="2026-02-01 06:49:56.224010291 +0000 UTC m=+146.709912654"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.224562 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-m947v"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.229874 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.262999 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8gp5z" podStartSLOduration=126.262977876 podStartE2EDuration="2m6.262977876s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.250136815 +0000 UTC m=+146.736039188" watchObservedRunningTime="2026-02-01 06:49:56.262977876 +0000 UTC m=+146.748880239"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.267570 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.267904 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.767873303 +0000 UTC m=+147.253775666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.268422 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.271464 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.771448434 +0000 UTC m=+147.257350797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.286966 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nr4jk" podStartSLOduration=126.286943129 podStartE2EDuration="2m6.286943129s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.285964892 +0000 UTC m=+146.771867255" watchObservedRunningTime="2026-02-01 06:49:56.286943129 +0000 UTC m=+146.772845492"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.357905 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" podStartSLOduration=126.357881311 podStartE2EDuration="2m6.357881311s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.356928315 +0000 UTC m=+146.842830678" watchObservedRunningTime="2026-02-01 06:49:56.357881311 +0000 UTC m=+146.843783674"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.410499 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.411201 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:56.911179018 +0000 UTC m=+147.397081381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.430490 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" podStartSLOduration=126.43047348 podStartE2EDuration="2m6.43047348s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.429426451 +0000 UTC m=+146.915328814" watchObservedRunningTime="2026-02-01 06:49:56.43047348 +0000 UTC m=+146.916375843"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.446692 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" podStartSLOduration=126.446676006 podStartE2EDuration="2m6.446676006s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:56.44645059 +0000 UTC m=+146.932352943" watchObservedRunningTime="2026-02-01 06:49:56.446676006 +0000 UTC m=+146.932578369"
Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.518753 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf"
Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.519030 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.019017588 +0000 UTC m=+147.504919951 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.620033 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.620183 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.120152959 +0000 UTC m=+147.606055322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.620679 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.620980 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.120970892 +0000 UTC m=+147.606873255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.721751 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.722175 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.222131753 +0000 UTC m=+147.708034116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.788012 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:49:56 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld Feb 01 06:49:56 crc kubenswrapper[5127]: [+]process-running ok Feb 01 06:49:56 crc kubenswrapper[5127]: healthz check failed Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.788079 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.823280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.823715 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.323698377 +0000 UTC m=+147.809600740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.924307 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.924488 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.424463777 +0000 UTC m=+147.910366140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:56 crc kubenswrapper[5127]: I0201 06:49:56.924600 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:56 crc kubenswrapper[5127]: E0201 06:49:56.924959 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.424952971 +0000 UTC m=+147.910855334 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.025676 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.025908 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.525876456 +0000 UTC m=+148.011778819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.026150 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.026502 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.526490283 +0000 UTC m=+148.012392646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.127873 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.128088 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.628065356 +0000 UTC m=+148.113967719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.128333 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.128747 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.628730506 +0000 UTC m=+148.114632869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.195044 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" event={"ID":"98ec7bfa-b340-4f65-ada5-db427bd64a4e","Type":"ContainerStarted","Data":"e88c9e39e3372b3a0ce4096d62ed1a5393cf98375ae4881b4f4e71b07ae0afae"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.195160 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.196843 5127 generic.go:334] "Generic (PLEG): container finished" podID="25a3f596-1eb9-4bdb-afe9-902545ed5197" containerID="6ec5945a268bda76ce78ed15012b934d30eaf49dff1b01847272084240448067" exitCode=0 Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.196874 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" event={"ID":"25a3f596-1eb9-4bdb-afe9-902545ed5197","Type":"ContainerDied","Data":"6ec5945a268bda76ce78ed15012b934d30eaf49dff1b01847272084240448067"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.200285 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" event={"ID":"3e586b38-fbcb-4142-b1b0-46f7824f9cc5","Type":"ContainerStarted","Data":"3cc9ba11a454e135af3ad01374f424f26c53e04379f94511d7f1ac112c4d3b61"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.202707 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" event={"ID":"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45","Type":"ContainerStarted","Data":"61505d5331938a2bd2de1adeecacf653cd465760954858ea1331c1570789105b"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.202741 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" event={"ID":"b5cc97e6-27f2-4c5d-9cca-da2a5daa4b45","Type":"ContainerStarted","Data":"f6de52bdb92455d91025cabf7f228cbffc2b3464b943d5585862871a69ef2652"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.205538 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" event={"ID":"85e1eb9a-c828-43bc-9d44-4071fb0a1210","Type":"ContainerStarted","Data":"facb589ca3debfb31f6ba4d5520bc5abf570ab8fc6b8b1b30a3c97ab80fb3146"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.207643 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-z8hkn" event={"ID":"4bc57b8f-e457-482e-8c41-27fbb779b6a5","Type":"ContainerStarted","Data":"3a8d82de8b9b88c456774dd9309971a20185ed46bc70938d6750ee4d1c682460"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.209300 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" 
event={"ID":"339e4434-c20b-49ab-8a89-234c633c788b","Type":"ContainerStarted","Data":"bd5be9ff7702339e92100bba1508e28abd8a26011e1bff5cd11d62686619c5cf"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.216093 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zh5x" event={"ID":"27eef381-a41a-4d53-bc8d-ed1cdc7b9109","Type":"ContainerStarted","Data":"9449ac1c684d62a4304ecd577155de3bf1d5a4971cf426aae1e14c3d25ce33d8"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.216355 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zh5x" event={"ID":"27eef381-a41a-4d53-bc8d-ed1cdc7b9109","Type":"ContainerStarted","Data":"592693a121209bdd9b32bf25954f95f733903e8801fc95da9f0e48011911b931"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.216457 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5zh5x" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.221885 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" podStartSLOduration=127.221865222 podStartE2EDuration="2m7.221865222s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.22109641 +0000 UTC m=+147.706998773" watchObservedRunningTime="2026-02-01 06:49:57.221865222 +0000 UTC m=+147.707767575" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.224685 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" event={"ID":"abc25820-f936-4169-996c-d337ec58713b","Type":"ContainerStarted","Data":"400f35ba08fbd7c67a6d1a3b633a0a93ba74dcee527310bdb5e49fb07f2ce195"} Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.228964 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.229549 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.729522697 +0000 UTC m=+148.215425060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.234607 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.251275 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" podStartSLOduration=127.251256867 podStartE2EDuration="2m7.251256867s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.24708147 +0000 UTC m=+147.732983833" watchObservedRunningTime="2026-02-01 06:49:57.251256867 +0000 UTC m=+147.737159230" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.254321 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rqb54" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.255348 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-tm8zv" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.330602 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.334149 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.834137795 +0000 UTC m=+148.320040148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.350355 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" podStartSLOduration=127.35033712 podStartE2EDuration="2m7.35033712s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.346891273 +0000 UTC m=+147.832793646" watchObservedRunningTime="2026-02-01 06:49:57.35033712 +0000 UTC m=+147.836239493" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.379573 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-pmp6b" podStartSLOduration=127.379553801 podStartE2EDuration="2m7.379553801s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.378882032 +0000 UTC m=+147.864784405" watchObservedRunningTime="2026-02-01 06:49:57.379553801 +0000 UTC m=+147.865456154" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.407592 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5zh5x" podStartSLOduration=8.407561847 podStartE2EDuration="8.407561847s" podCreationTimestamp="2026-02-01 06:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.405670494 +0000 UTC m=+147.891572877" watchObservedRunningTime="2026-02-01 06:49:57.407561847 +0000 UTC m=+147.893464200" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.432184 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.432777 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:57.932731724 +0000 UTC m=+148.418634098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.440487 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.440543 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.442724 5127 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-fr6ms container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.442802 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" podUID="3e586b38-fbcb-4142-b1b0-46f7824f9cc5" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.533399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.533797 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.033784333 +0000 UTC m=+148.519686696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.635657 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.635817 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 06:49:58.135791209 +0000 UTC m=+148.621693572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.636275 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.636865 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.136857569 +0000 UTC m=+148.622759932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.688861 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5fzh2" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.740098 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.740386 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.240370546 +0000 UTC m=+148.726272909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.793168 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:49:57 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld Feb 01 06:49:57 crc kubenswrapper[5127]: [+]process-running ok Feb 01 06:49:57 crc kubenswrapper[5127]: healthz check failed Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.793221 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.841227 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.841592 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.341564779 +0000 UTC m=+148.827467142 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.942039 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.942176 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.442158485 +0000 UTC m=+148.928060848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.942599 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:57 crc kubenswrapper[5127]: E0201 06:49:57.942898 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.442884445 +0000 UTC m=+148.928786808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:57 crc kubenswrapper[5127]: I0201 06:49:57.971269 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jnckc" podStartSLOduration=127.971253422 podStartE2EDuration="2m7.971253422s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:57.964697118 +0000 UTC m=+148.450599481" watchObservedRunningTime="2026-02-01 06:49:57.971253422 +0000 UTC m=+148.457155785" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.043914 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.044104 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.544080187 +0000 UTC m=+149.029982550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.044358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.044668 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.544660214 +0000 UTC m=+149.030562567 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.145931 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.146138 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.646112914 +0000 UTC m=+149.132015277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.146236 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.146302 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.146338 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.146379 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.146414 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.146537 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.646525645 +0000 UTC m=+149.132428008 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.147441 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.155844 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.157358 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.168902 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.176647 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.190486 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.238801 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6c0fb18-dc93-4aed-abd0-55631d324b99" containerID="a428cffef45a764e039c470bf17d5dd774730f225654e331579ef58a51de2f36" exitCode=0 Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.245434 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" event={"ID":"b6c0fb18-dc93-4aed-abd0-55631d324b99","Type":"ContainerDied","Data":"a428cffef45a764e039c470bf17d5dd774730f225654e331579ef58a51de2f36"} Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.249096 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.249195 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.749177569 +0000 UTC m=+149.235079922 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.249401 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.249681 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.749670403 +0000 UTC m=+149.235572766 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.249808 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" event={"ID":"339e4434-c20b-49ab-8a89-234c633c788b","Type":"ContainerStarted","Data":"b24b5cf0101f9e464f1277036538e86d3d8242fae78ab3439fa6e75e8312951c"} Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.249843 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" event={"ID":"339e4434-c20b-49ab-8a89-234c633c788b","Type":"ContainerStarted","Data":"91db919f37d774b97c4665ffcddd92c1c9b83ef8e7d17ec187140e6788c2bb60"} Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.265637 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" event={"ID":"25a3f596-1eb9-4bdb-afe9-902545ed5197","Type":"ContainerStarted","Data":"bfcce2c81f710e4fb90dc8f47d8299e347e6bccf08e4f2c71d79aa17739a0180"} Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.304315 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" podStartSLOduration=128.304295037 podStartE2EDuration="2m8.304295037s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:58.302756554 +0000 UTC m=+148.788658917" watchObservedRunningTime="2026-02-01 06:49:58.304295037 +0000 UTC m=+148.790197420" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.350287 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.350467 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.850436174 +0000 UTC m=+149.336338537 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.458626 5127 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.460419 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.463810 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.465186 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:58.965172777 +0000 UTC m=+149.451075140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.564859 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.565207 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.065192775 +0000 UTC m=+149.551095138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.584100 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.584982 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.609623 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.612267 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.667664 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.674779 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.174755453 +0000 UTC m=+149.660657816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: W0201 06:49:58.765179 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-01ccef495414007b40838a59a388c4a535a4794d0cc7349b80f6cbbe2ad4955e WatchSource:0}: Error finding container 01ccef495414007b40838a59a388c4a535a4794d0cc7349b80f6cbbe2ad4955e: Status 404 returned error can't find the container with id 01ccef495414007b40838a59a388c4a535a4794d0cc7349b80f6cbbe2ad4955e Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.778721 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.778861 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.278840388 +0000 UTC m=+149.764742761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.778976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvf7n\" (UniqueName: \"kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.779023 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.779041 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.779058 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.779325 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.279318321 +0000 UTC m=+149.765220694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.794518 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.795605 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.804679 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:49:58 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld Feb 01 06:49:58 crc kubenswrapper[5127]: [+]process-running ok Feb 01 06:49:58 crc kubenswrapper[5127]: healthz check failed Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.804742 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.804805 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.807132 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.885999 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.886210 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvf7n\" (UniqueName: \"kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.886259 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.886281 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.886687 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.386664536 +0000 UTC m=+149.872566899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.886720 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.886968 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.924485 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvf7n\" (UniqueName: \"kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n\") pod \"community-operators-6v4qm\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.952718 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.954839 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.984742 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.990709 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.990773 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.990797 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnj4r\" (UniqueName: \"kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.990834 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:58 crc kubenswrapper[5127]: E0201 06:49:58.991187 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.491173321 +0000 UTC m=+149.977075674 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:58 crc kubenswrapper[5127]: I0201 06:49:58.991835 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.013995 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:49:59 crc kubenswrapper[5127]: W0201 06:49:59.077536 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6e594f79d5ea57c6fad232e5b417f3bba32b83d7eacb0621be4a03303b7e33cd WatchSource:0}: Error finding container 6e594f79d5ea57c6fad232e5b417f3bba32b83d7eacb0621be4a03303b7e33cd: Status 404 returned error can't find the container with id 6e594f79d5ea57c6fad232e5b417f3bba32b83d7eacb0621be4a03303b7e33cd Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.105191 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.105424 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.105536 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnj4r\" (UniqueName: \"kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.105593 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: E0201 06:49:59.105768 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.60574412 +0000 UTC m=+150.091646483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.106018 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.115742 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.136107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnj4r\" (UniqueName: \"kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r\") pod \"certified-operators-rdtzv\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.172866 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.189971 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"] Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.197683 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.206644 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"] Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207488 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207533 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207559 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207592 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207616 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207658 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4fr\" (UniqueName: \"kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.207673 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qxpv\" (UniqueName: \"kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: E0201 06:49:59.207932 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 06:49:59.707919791 +0000 UTC m=+150.193822154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.254047 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.278553 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" event={"ID":"339e4434-c20b-49ab-8a89-234c633c788b","Type":"ContainerStarted","Data":"1481939c3d1fee3af4d11cfad430814372b053598d01e3d31f07062793e8a043"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310055 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310270 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310302 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310409 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4fr\" (UniqueName: \"kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310428 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qxpv\" (UniqueName: \"kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.310528 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: E0201 06:49:59.311163 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.81112853 +0000 UTC m=+150.297030893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.311596 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.311697 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.311879 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.312080 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6e594f79d5ea57c6fad232e5b417f3bba32b83d7eacb0621be4a03303b7e33cd"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.319066 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rgmvx" podStartSLOduration=10.319049782 podStartE2EDuration="10.319049782s" podCreationTimestamp="2026-02-01 06:49:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:49:59.318468056 +0000 UTC m=+149.804370419" watchObservedRunningTime="2026-02-01 06:49:59.319049782 +0000 UTC m=+149.804952145" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.322918 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " 
pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.357348 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4fr\" (UniqueName: \"kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr\") pod \"community-operators-dzj87\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.360950 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qxpv\" (UniqueName: \"kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv\") pod \"certified-operators-x4b5r\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") " pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.368176 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"796f98fea8e0ae84bddd06df232ef173e9572dc24792a6c70fb69b8eb3ec495b"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.368271 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"01ccef495414007b40838a59a388c4a535a4794d0cc7349b80f6cbbe2ad4955e"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.369428 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.402724 5127 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-01T06:49:58.458871269Z","Handler":null,"Name":""} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.411093 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: E0201 06:49:59.412482 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 06:49:59.912464636 +0000 UTC m=+150.398366999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-m8jnf" (UID: "f90c88e9-7849-4ec4-9df3-311426864686") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.437405 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a2dee03bd13fbe86cf34d22e3eef7220120cb0deb6601713be6c27c9435f453a"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.437436 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b93f6fda1ce7091f0a5bebea215fbb39294be6c83a91344e527b04fa1fbd6d19"} Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.511998 5127 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.512369 5127 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.516798 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.539776 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.547079 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.591141 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.618499 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.642851 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.647948 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.648004 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.701705 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-m8jnf\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.780963 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.794874 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:49:59 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld Feb 01 06:49:59 crc kubenswrapper[5127]: [+]process-running ok Feb 01 06:49:59 crc kubenswrapper[5127]: healthz check failed Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.794923 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.823513 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume\") pod \"b6c0fb18-dc93-4aed-abd0-55631d324b99\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.823799 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7n9r\" (UniqueName: \"kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r\") pod \"b6c0fb18-dc93-4aed-abd0-55631d324b99\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.823829 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume\") pod \"b6c0fb18-dc93-4aed-abd0-55631d324b99\" (UID: \"b6c0fb18-dc93-4aed-abd0-55631d324b99\") " Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.828080 5127 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume" (OuterVolumeSpecName: "config-volume") pod "b6c0fb18-dc93-4aed-abd0-55631d324b99" (UID: "b6c0fb18-dc93-4aed-abd0-55631d324b99"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.835065 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r" (OuterVolumeSpecName: "kube-api-access-m7n9r") pod "b6c0fb18-dc93-4aed-abd0-55631d324b99" (UID: "b6c0fb18-dc93-4aed-abd0-55631d324b99"). InnerVolumeSpecName "kube-api-access-m7n9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.835680 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b6c0fb18-dc93-4aed-abd0-55631d324b99" (UID: "b6c0fb18-dc93-4aed-abd0-55631d324b99"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.863806 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.925593 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b6c0fb18-dc93-4aed-abd0-55631d324b99-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.925621 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7n9r\" (UniqueName: \"kubernetes.io/projected/b6c0fb18-dc93-4aed-abd0-55631d324b99-kube-api-access-m7n9r\") on node \"crc\" DevicePath \"\"" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.925630 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b6c0fb18-dc93-4aed-abd0-55631d324b99-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 06:49:59 crc kubenswrapper[5127]: I0201 06:49:59.965272 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"] Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.121984 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:50:00 crc kubenswrapper[5127]: W0201 06:50:00.131819 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd3d1fb_13f1_452e_afa2_580c6d736be3.slice/crio-0d4959dd54f527301ca9d76447a8829d6de382441c13dcb2c8685fbcd0df03b0 WatchSource:0}: Error finding container 0d4959dd54f527301ca9d76447a8829d6de382441c13dcb2c8685fbcd0df03b0: Status 404 returned error can't find the container with id 0d4959dd54f527301ca9d76447a8829d6de382441c13dcb2c8685fbcd0df03b0 Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.243202 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.259847 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"] Feb 01 06:50:00 crc kubenswrapper[5127]: W0201 06:50:00.295595 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90c88e9_7849_4ec4_9df3_311426864686.slice/crio-c79cc141021a8ce9da602a954994824863a0ff0481ada0d05ac8ef255ab01afd WatchSource:0}: Error finding container c79cc141021a8ce9da602a954994824863a0ff0481ada0d05ac8ef255ab01afd: Status 404 returned error can't find the container with id c79cc141021a8ce9da602a954994824863a0ff0481ada0d05ac8ef255ab01afd Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.462832 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" event={"ID":"f90c88e9-7849-4ec4-9df3-311426864686","Type":"ContainerStarted","Data":"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.462916 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" event={"ID":"f90c88e9-7849-4ec4-9df3-311426864686","Type":"ContainerStarted","Data":"c79cc141021a8ce9da602a954994824863a0ff0481ada0d05ac8ef255ab01afd"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.463914 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.469769 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2b057f7bd032873816a52cc98695c92b17f394cac680609b9cd3f638d6199205"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.474193 5127 generic.go:334] "Generic (PLEG): container finished" podID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerID="095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d" exitCode=0 Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.474258 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerDied","Data":"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.474283 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerStarted","Data":"c65ccf9628a04f47c291e8c03b4e4e77476bc37914b356e46721887c71b6a292"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.477073 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.479520 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" event={"ID":"b6c0fb18-dc93-4aed-abd0-55631d324b99","Type":"ContainerDied","Data":"5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.480480 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad82ca012798110fdd6f372d70c6b940d86a91b49c8c927aadcdf90c15736be" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.479824 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.484432 5127 generic.go:334] "Generic (PLEG): container finished" podID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerID="3ec61d217380b97a3274e891b06e3d12fe035aef8ba92fd3ef037a365acf64e0" exitCode=0 Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.484522 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerDied","Data":"3ec61d217380b97a3274e891b06e3d12fe035aef8ba92fd3ef037a365acf64e0"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.484552 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerStarted","Data":"dc4805787ae40f37b2f33fc5cde7415976f0606675e6be59e2bd9453d120119f"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.489743 5127 generic.go:334] "Generic (PLEG): container finished" podID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerID="d574a92e6ca46c70f085afe245abffec4e912391f5e64bb7f571ec48168fe2fd" exitCode=0 Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.489883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerDied","Data":"d574a92e6ca46c70f085afe245abffec4e912391f5e64bb7f571ec48168fe2fd"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.489931 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerStarted","Data":"9a4196927b5a9003ba85bd2d46ddcae376fdbacbddc72472b131b734c270a43e"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.499760 5127 generic.go:334] "Generic (PLEG): container finished" podID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerID="0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1" exitCode=0 Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.501301 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerDied","Data":"0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.501336 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerStarted","Data":"0d4959dd54f527301ca9d76447a8829d6de382441c13dcb2c8685fbcd0df03b0"} Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.512861 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qxg44" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.513459 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" podStartSLOduration=130.513449363 podStartE2EDuration="2m10.513449363s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:00.488452341 +0000 UTC m=+150.974354704" 
watchObservedRunningTime="2026-02-01 06:50:00.513449363 +0000 UTC m=+150.999351716" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.694097 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:50:00 crc kubenswrapper[5127]: E0201 06:50:00.694385 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c0fb18-dc93-4aed-abd0-55631d324b99" containerName="collect-profiles" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.694405 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c0fb18-dc93-4aed-abd0-55631d324b99" containerName="collect-profiles" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.694539 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c0fb18-dc93-4aed-abd0-55631d324b99" containerName="collect-profiles" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.695138 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.697326 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.697415 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.703898 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.762833 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.763809 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.766264 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.778975 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.787167 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 06:50:00 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld Feb 01 06:50:00 crc kubenswrapper[5127]: [+]process-running ok Feb 01 06:50:00 crc kubenswrapper[5127]: healthz check failed Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.787231 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.849918 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.849989 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951324 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951393 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951421 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pth5\" (UniqueName: \"kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951484 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951513 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.951620 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:00 crc kubenswrapper[5127]: I0201 06:50:00.991188 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.009983 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.052335 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.052437 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.052469 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pth5\" (UniqueName: \"kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.053317 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.053379 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 
crc kubenswrapper[5127]: I0201 06:50:01.074318 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pth5\" (UniqueName: \"kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5\") pod \"redhat-marketplace-bxqsq\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.075916 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.177768 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"] Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.179352 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.182500 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"] Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.302161 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.360374 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.360444 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.360524 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95lhk\" (UniqueName: \"kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.392260 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 01 06:50:01 crc kubenswrapper[5127]: W0201 06:50:01.415345 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ea38b4f_0889_4c3c_bdfc_7afa8a41eb30.slice/crio-f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2 WatchSource:0}: Error finding container f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2: Status 404 returned error can't find the container with id f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2 Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.462254 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95lhk\" (UniqueName: \"kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk\") pod \"redhat-marketplace-fxlfq\" (UID: 
\"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.462324 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.462355 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.462991 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.463038 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.491277 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95lhk\" (UniqueName: \"kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk\") pod \"redhat-marketplace-fxlfq\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.530204 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerStarted","Data":"f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2"} Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.540826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ea068af-8596-49c2-a8fa-a782970f8103","Type":"ContainerStarted","Data":"0a69fc1d41900f66a8bb45b1f281f9e7dab5f6b16852ba940884f9ac41aa8a03"} Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.556155 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.770316 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"] Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.771689 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.776258 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.784030 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"]
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.803832 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:50:01 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:50:01 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:50:01 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.803877 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.864901 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"]
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.877079 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.877143 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bk6x\" (UniqueName: \"kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.877181 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:50:01 crc kubenswrapper[5127]: W0201 06:50:01.883543 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0285e4e_ca44_40ee_aaad_4c2bef41ce24.slice/crio-60a56b4fdabaa2b15e68fa9b6ba0e7e0fccc9b39a50b1e23a4059deb1e83a059 WatchSource:0}: Error finding container 60a56b4fdabaa2b15e68fa9b6ba0e7e0fccc9b39a50b1e23a4059deb1e83a059: Status 404 returned error can't find the container with id 60a56b4fdabaa2b15e68fa9b6ba0e7e0fccc9b39a50b1e23a4059deb1e83a059
Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.978908 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bk6x\" (UniqueName: \"kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x\") pod \"redhat-operators-jjscz\" (UID:
\"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.978959 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.979069 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.979670 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:01 crc kubenswrapper[5127]: I0201 06:50:01.980323 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.005350 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bk6x\" (UniqueName: \"kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x\") pod \"redhat-operators-jjscz\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.022779 5127 patch_prober.go:28] interesting pod/downloads-7954f5f757-8rw72 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.022866 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-8rw72" podUID="27bef954-7005-4626-9049-195b48a9365b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.023241 5127 patch_prober.go:28] interesting pod/downloads-7954f5f757-8rw72 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.023300 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-8rw72" podUID="27bef954-7005-4626-9049-195b48a9365b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.122666 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.130923 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.130979 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.140173 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.170782 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"] Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.171727 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.177867 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"] Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.277114 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.278171 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.281222 5127 patch_prober.go:28] interesting pod/console-f9d7485db-zfpgn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.281284 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-zfpgn" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.282290 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qgc\" (UniqueName: \"kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.282338 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.282371 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.389720 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84qgc\" (UniqueName: \"kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.390102 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.390136 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.391681 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.391918 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.433961 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qgc\" (UniqueName: \"kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc\") pod \"redhat-operators-l5p2j\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.446821 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.451243 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fr6ms" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.542813 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.554987 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.629673 5127 generic.go:334] "Generic (PLEG): container finished" podID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerID="59bfed2c64776ad46b29c69064635dcd61034fc1749587144db6149b6430b3d6" exitCode=0 Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.629779 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerDied","Data":"59bfed2c64776ad46b29c69064635dcd61034fc1749587144db6149b6430b3d6"} Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.634095 5127 generic.go:334] "Generic (PLEG): container finished" podID="0ea068af-8596-49c2-a8fa-a782970f8103" containerID="70ffa062bfa74f55c7683b1e4f7bf00bd85666bea6d7ca2c3c2d223d2bd4815d" exitCode=0 Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.634144 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ea068af-8596-49c2-a8fa-a782970f8103","Type":"ContainerDied","Data":"70ffa062bfa74f55c7683b1e4f7bf00bd85666bea6d7ca2c3c2d223d2bd4815d"} Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.636772 5127 generic.go:334] "Generic (PLEG): container finished" podID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerID="dd57f9a6801dcbd62391abf651a039319f17cad6921122a1645808661a6a8695" exitCode=0 Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.638262 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerDied","Data":"dd57f9a6801dcbd62391abf651a039319f17cad6921122a1645808661a6a8695"} Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.638279 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerStarted","Data":"60a56b4fdabaa2b15e68fa9b6ba0e7e0fccc9b39a50b1e23a4059deb1e83a059"} Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.646807 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-w4bsb" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.648020 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"] Feb 01 06:50:02 crc kubenswrapper[5127]: W0201 06:50:02.693874 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d7fc8d_87e8_455b_9f99_fde65167beea.slice/crio-d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0 WatchSource:0}: Error finding container d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0: Status 404 returned error can't find the container with id d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0 Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.788287 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-z7mc2" Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.791528 5127 patch_prober.go:28] interesting 
pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:50:02 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:50:02 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:50:02 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:50:02 crc kubenswrapper[5127]: I0201 06:50:02.791562 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.038779 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"]
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.646293 5127 generic.go:334] "Generic (PLEG): container finished" podID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerID="e5958004707a41d32faab7b32042485f82253776f5fc7d7ca5ec44cc07e41be8" exitCode=0
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.646882 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerDied","Data":"e5958004707a41d32faab7b32042485f82253776f5fc7d7ca5ec44cc07e41be8"}
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.646912 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerStarted","Data":"d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0"}
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.652116 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerID="0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d" exitCode=0
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.652296 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerDied","Data":"0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d"}
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.652331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerStarted","Data":"9a08bd0d3a7dd927bfb3d7f09063299709fc4c50a4723a2245b2d98516a94e00"}
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.786653 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:50:03 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:50:03 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:50:03 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.786704 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:50:03 crc
kubenswrapper[5127]: I0201 06:50:03.789056 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.789786 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.791636 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.794343 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.802615 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.917547 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.917642 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:03 crc kubenswrapper[5127]: I0201 06:50:03.942988 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.018540 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir\") pod \"0ea068af-8596-49c2-a8fa-a782970f8103\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.018635 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0ea068af-8596-49c2-a8fa-a782970f8103" (UID: "0ea068af-8596-49c2-a8fa-a782970f8103"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.018693 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access\") pod \"0ea068af-8596-49c2-a8fa-a782970f8103\" (UID: \"0ea068af-8596-49c2-a8fa-a782970f8103\") " Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.018924 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.018988 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.019032 5127 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0ea068af-8596-49c2-a8fa-a782970f8103-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.019249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.023383 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0ea068af-8596-49c2-a8fa-a782970f8103" (UID: "0ea068af-8596-49c2-a8fa-a782970f8103"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.040705 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.120809 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0ea068af-8596-49c2-a8fa-a782970f8103-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.128841 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.496792 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 01 06:50:04 crc kubenswrapper[5127]: W0201 06:50:04.574056 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5468bc88_58bd_4857_901d_07c9f917dbf0.slice/crio-ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532 WatchSource:0}: Error finding container ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532: Status 404 returned error can't find the container with id ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.692654 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0ea068af-8596-49c2-a8fa-a782970f8103","Type":"ContainerDied","Data":"0a69fc1d41900f66a8bb45b1f281f9e7dab5f6b16852ba940884f9ac41aa8a03"}
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.692881 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a69fc1d41900f66a8bb45b1f281f9e7dab5f6b16852ba940884f9ac41aa8a03"
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.692735 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.696087 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5468bc88-58bd-4857-901d-07c9f917dbf0","Type":"ContainerStarted","Data":"ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532"}
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.786867 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:50:04 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:50:04 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:50:04 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:50:04 crc kubenswrapper[5127]: I0201 06:50:04.786930 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:50:05 crc kubenswrapper[5127]: I0201 06:50:05.749803 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5468bc88-58bd-4857-901d-07c9f917dbf0","Type":"ContainerStarted","Data":"e9d37e74a2ba5a6fb9d0181813f0d061dcde5e109957148c5b3037d0da9f75d8"}
Feb 01 06:50:05 crc kubenswrapper[5127]: I0201 06:50:05.774406 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.774391465 podStartE2EDuration="2.774391465s" podCreationTimestamp="2026-02-01 06:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:05.772405589 +0000 UTC m=+156.258307952" watchObservedRunningTime="2026-02-01 06:50:05.774391465 +0000 UTC m=+156.260293828"
Feb 01 06:50:05 crc kubenswrapper[5127]: I0201 06:50:05.787094 5127 patch_prober.go:28] interesting pod/router-default-5444994796-z7mc2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 01 06:50:05 crc kubenswrapper[5127]: [-]has-synced failed: reason withheld
Feb 01 06:50:05 crc kubenswrapper[5127]: [+]process-running ok
Feb 01 06:50:05 crc kubenswrapper[5127]: healthz check failed
Feb 01 06:50:05 crc kubenswrapper[5127]: I0201 06:50:05.787142 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-z7mc2" podUID="af3eca47-a30a-4d71-812b-01b5719b08e9" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.748089 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.748139 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.766745 5127 generic.go:334] "Generic (PLEG): container finished" podID="5468bc88-58bd-4857-901d-07c9f917dbf0" containerID="e9d37e74a2ba5a6fb9d0181813f0d061dcde5e109957148c5b3037d0da9f75d8" exitCode=0
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.766783 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5468bc88-58bd-4857-901d-07c9f917dbf0","Type":"ContainerDied","Data":"e9d37e74a2ba5a6fb9d0181813f0d061dcde5e109957148c5b3037d0da9f75d8"}
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.789469 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-z7mc2"
Feb 01 06:50:06 crc kubenswrapper[5127]: I0201 06:50:06.792296 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-z7mc2"
Feb 01 06:50:07 crc kubenswrapper[5127]: I0201 06:50:07.895813 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5zh5x"
Feb 01 06:50:12 crc kubenswrapper[5127]: I0201 06:50:12.026978 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-8rw72"
Feb 01 06:50:12 crc kubenswrapper[5127]: I0201 06:50:12.274859 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:50:12 crc kubenswrapper[5127]: I0201 06:50:12.278126 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-zfpgn"
Feb 01 06:50:12 crc kubenswrapper[5127]: I0201 06:50:12.894373 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod
\"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:50:12 crc kubenswrapper[5127]: I0201 06:50:12.915338 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bafc814f-6c41-40cf-b3f4-8babc6ec840a-metrics-certs\") pod \"network-metrics-daemon-ls5xc\" (UID: \"bafc814f-6c41-40cf-b3f4-8babc6ec840a\") " pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:50:13 crc kubenswrapper[5127]: I0201 06:50:13.057887 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ls5xc" Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.267577 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"] Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.268236 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager" containerID="cri-o://34f6cce712a98ed1bf252839d852b104a9251477ae9239a79818d89dfcd5d94a" gracePeriod=30 Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.301933 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"] Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.302275 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager" containerID="cri-o://344e8cf1b56504e4f618103bc3bad4ebd1f6fc58d6453612a2b7c7203fb31887" gracePeriod=30 Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.847767 5127 generic.go:334] "Generic (PLEG): container finished" podID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerID="34f6cce712a98ed1bf252839d852b104a9251477ae9239a79818d89dfcd5d94a" exitCode=0 Feb 01 06:50:17 crc kubenswrapper[5127]: I0201 06:50:17.847826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" event={"ID":"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19","Type":"ContainerDied","Data":"34f6cce712a98ed1bf252839d852b104a9251477ae9239a79818d89dfcd5d94a"} Feb 01 06:50:19 crc kubenswrapper[5127]: I0201 06:50:19.881820 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:50:21 crc kubenswrapper[5127]: I0201 06:50:21.924304 5127 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mpbrq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 01 06:50:21 crc kubenswrapper[5127]: I0201 06:50:21.926828 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 01 06:50:22 crc kubenswrapper[5127]: I0201 06:50:22.940645 5127 
patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwdsw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 01 06:50:22 crc kubenswrapper[5127]: I0201 06:50:22.940725 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.358139 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.509454 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir\") pod \"5468bc88-58bd-4857-901d-07c9f917dbf0\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.509731 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access\") pod \"5468bc88-58bd-4857-901d-07c9f917dbf0\" (UID: \"5468bc88-58bd-4857-901d-07c9f917dbf0\") " Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.510049 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5468bc88-58bd-4857-901d-07c9f917dbf0" (UID: "5468bc88-58bd-4857-901d-07c9f917dbf0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.510212 5127 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5468bc88-58bd-4857-901d-07c9f917dbf0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.518824 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5468bc88-58bd-4857-901d-07c9f917dbf0" (UID: "5468bc88-58bd-4857-901d-07c9f917dbf0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.614326 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5468bc88-58bd-4857-901d-07c9f917dbf0-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.922993 5127 generic.go:334] "Generic (PLEG): container finished" podID="06682504-c11d-41d1-838a-a336640770a8" containerID="344e8cf1b56504e4f618103bc3bad4ebd1f6fc58d6453612a2b7c7203fb31887" exitCode=0 Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.923078 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" event={"ID":"06682504-c11d-41d1-838a-a336640770a8","Type":"ContainerDied","Data":"344e8cf1b56504e4f618103bc3bad4ebd1f6fc58d6453612a2b7c7203fb31887"} Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.925481 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5468bc88-58bd-4857-901d-07c9f917dbf0","Type":"ContainerDied","Data":"ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532"} Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.925515 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 06:50:27 crc kubenswrapper[5127]: I0201 06:50:27.925527 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbeb1363b6c0901d1be56a29ff7f13b2438a5f0908df637a71daf6fb72fb532" Feb 01 06:50:31 crc kubenswrapper[5127]: E0201 06:50:31.767486 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 01 06:50:31 crc kubenswrapper[5127]: E0201 06:50:31.768257 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn4fr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dzj87_openshift-marketplace(6dd3d1fb-13f1-452e-afa2-580c6d736be3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 06:50:31 crc kubenswrapper[5127]: E0201 06:50:31.769433 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dzj87" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" Feb 01 06:50:32 crc kubenswrapper[5127]: I0201 06:50:32.835878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-lc9tz" Feb 01 06:50:32 crc kubenswrapper[5127]: I0201 06:50:32.924115 5127 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mpbrq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 01 06:50:32 crc kubenswrapper[5127]: I0201 06:50:32.924199 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 01 06:50:33 crc kubenswrapper[5127]: E0201 06:50:33.013636 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dzj87" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" Feb 01 06:50:33 crc kubenswrapper[5127]: I0201 06:50:33.940089 5127 
patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwdsw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 01 06:50:33 crc kubenswrapper[5127]: I0201 06:50:33.940482 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 01 06:50:36 crc kubenswrapper[5127]: E0201 06:50:36.457293 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 01 06:50:36 crc kubenswrapper[5127]: E0201 06:50:36.457957 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bk6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jjscz_openshift-marketplace(02d7fc8d-87e8-455b-9f99-fde65167beea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:36 crc kubenswrapper[5127]: E0201 06:50:36.460287 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jjscz" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea"
Feb 01 06:50:36 crc kubenswrapper[5127]: I0201 06:50:36.740636 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 06:50:36 crc kubenswrapper[5127]: I0201 06:50:36.740702 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.737333 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jjscz" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.801931 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.812128 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.817959 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.818114 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5qxpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-x4b5r_openshift-marketplace(40d7e137-a822-4ed9-b1cf-a123d53e4122): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.820677 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-x4b5r" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831464 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"]
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.831708 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea068af-8596-49c2-a8fa-a782970f8103" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831723 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea068af-8596-49c2-a8fa-a782970f8103" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.831739 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831747 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.831761 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831770 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.831785 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5468bc88-58bd-4857-901d-07c9f917dbf0" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831792 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5468bc88-58bd-4857-901d-07c9f917dbf0" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831878 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5468bc88-58bd-4857-901d-07c9f917dbf0" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831888 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" containerName="controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831894 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="06682504-c11d-41d1-838a-a336640770a8" containerName="route-controller-manager"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.831907 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea068af-8596-49c2-a8fa-a782970f8103" containerName="pruner"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.832242 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.838340 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.838484 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-95lhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fxlfq_openshift-marketplace(e0285e4e-ca44-40ee-aaad-4c2bef41ce24): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.839948 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fxlfq" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.842734 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"]
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.865430 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.865580 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rvf7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6v4qm_openshift-marketplace(ac3c627d-681a-4008-9bda-5e5f3af5aafd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.867391 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6v4qm" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.884901 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.885023 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2pth5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bxqsq_openshift-marketplace(8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.886168 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bxqsq" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.887882 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.888062 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84qgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l5p2j_openshift-marketplace(1c2c6b95-9c35-4fa1-be58-4e825fd86e97): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.889222 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l5p2j" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956020 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert\") pod \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956074 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca\") pod \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956121 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config\") pod \"06682504-c11d-41d1-838a-a336640770a8\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956156 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkr67\" (UniqueName: \"kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67\") pod \"06682504-c11d-41d1-838a-a336640770a8\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956203 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghnrs\" (UniqueName: \"kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs\") pod \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956230 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles\") pod \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956249 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca\") pod \"06682504-c11d-41d1-838a-a336640770a8\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956279 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert\") pod \"06682504-c11d-41d1-838a-a336640770a8\" (UID: \"06682504-c11d-41d1-838a-a336640770a8\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956294 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config\") pod \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\" (UID: \"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19\") "
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956425 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956456 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956514 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf6sn\" (UniqueName: \"kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.956554 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.957182 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config" (OuterVolumeSpecName: "config") pod "06682504-c11d-41d1-838a-a336640770a8" (UID: "06682504-c11d-41d1-838a-a336640770a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.957213 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca" (OuterVolumeSpecName: "client-ca") pod "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" (UID: "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.958156 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config" (OuterVolumeSpecName: "config") pod "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" (UID: "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.959302 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "06682504-c11d-41d1-838a-a336640770a8" (UID: "06682504-c11d-41d1-838a-a336640770a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.959643 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" (UID: "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.961506 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" (UID: "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.962505 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67" (OuterVolumeSpecName: "kube-api-access-nkr67") pod "06682504-c11d-41d1-838a-a336640770a8" (UID: "06682504-c11d-41d1-838a-a336640770a8"). InnerVolumeSpecName "kube-api-access-nkr67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.963614 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "06682504-c11d-41d1-838a-a336640770a8" (UID: "06682504-c11d-41d1-838a-a336640770a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.964786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs" (OuterVolumeSpecName: "kube-api-access-ghnrs") pod "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" (UID: "55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19"). InnerVolumeSpecName "kube-api-access-ghnrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.982701 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw" event={"ID":"55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19","Type":"ContainerDied","Data":"32ee07ac1a814b003b417efa26b8aa0b63313ec96c2604225ce78123c0e1632f"}
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.982833 5127 scope.go:117] "RemoveContainer" containerID="34f6cce712a98ed1bf252839d852b104a9251477ae9239a79818d89dfcd5d94a"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.982991 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwdsw"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.985867 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"
Feb 01 06:50:37 crc kubenswrapper[5127]: I0201 06:50:37.986503 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq" event={"ID":"06682504-c11d-41d1-838a-a336640770a8","Type":"ContainerDied","Data":"506c7d8d1982d04ce9caf44fa1765c365bfe0b7814fb93792ed63153405b0036"}
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.986673 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6v4qm" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.987563 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-l5p2j" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.988416 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bxqsq" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.989285 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fxlfq" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24"
Feb 01 06:50:37 crc kubenswrapper[5127]: E0201 06:50:37.989534 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-x4b5r" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.020084 5127 scope.go:117] "RemoveContainer" containerID="344e8cf1b56504e4f618103bc3bad4ebd1f6fc58d6453612a2b7c7203fb31887"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058697 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf6sn\" (UniqueName: \"kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058767 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058800 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058829 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058874 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058886 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-client-ca\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058899 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-config\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058911 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkr67\" (UniqueName: \"kubernetes.io/projected/06682504-c11d-41d1-838a-a336640770a8-kube-api-access-nkr67\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058923 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghnrs\" (UniqueName: \"kubernetes.io/projected/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-kube-api-access-ghnrs\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058933 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058943 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06682504-c11d-41d1-838a-a336640770a8-client-ca\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058952 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06682504-c11d-41d1-838a-a336640770a8-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.058961 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19-config\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.059824 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.060061 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.062666 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.075780 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf6sn\" (UniqueName: \"kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn\") pod \"route-controller-manager-5b7cdbfd6d-7klkp\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.083047 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"]
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.087921 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mpbrq"]
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.103862 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"]
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.103917 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwdsw"]
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.171308 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.197203 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ls5xc"]
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.199685 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.247204 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06682504-c11d-41d1-838a-a336640770a8" path="/var/lib/kubelet/pods/06682504-c11d-41d1-838a-a336640770a8/volumes"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.248310 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19" path="/var/lib/kubelet/pods/55a5e7b6-98c3-4fc7-ad7d-5f3ae23aaf19/volumes"
Feb 01 06:50:38 crc kubenswrapper[5127]: I0201 06:50:38.362742 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"]
Feb 01 06:50:38 crc kubenswrapper[5127]: W0201 06:50:38.414898 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94519f4_8cda_4d17_a851_bd4ff661f98b.slice/crio-e3e7e493263d77c8ff0629b7a2fef81b05ecd529060a0d57fc82eb4ad62f0d13 WatchSource:0}: Error finding container e3e7e493263d77c8ff0629b7a2fef81b05ecd529060a0d57fc82eb4ad62f0d13: Status 404 returned error can't find the container with id e3e7e493263d77c8ff0629b7a2fef81b05ecd529060a0d57fc82eb4ad62f0d13
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.001254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" event={"ID":"bafc814f-6c41-40cf-b3f4-8babc6ec840a","Type":"ContainerStarted","Data":"1b45f242bc6bfdf888eedb94fc534a9244ff866e23a712cdc1a113fe2e5fd4c9"}
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.001319 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" event={"ID":"bafc814f-6c41-40cf-b3f4-8babc6ec840a","Type":"ContainerStarted","Data":"4de7cd28fae517179d9a9898af13b0c4c9864d44f9bd184c8e8eacee79748387"}
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.003708 5127 generic.go:334] "Generic (PLEG): container finished" podID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerID="50a1cf66fceb6cbde7e81d09d160e3f57e1b3f770e2c9f17e44d3c2e44530812" exitCode=0
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.003759 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerDied","Data":"50a1cf66fceb6cbde7e81d09d160e3f57e1b3f770e2c9f17e44d3c2e44530812"}
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.006662 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" event={"ID":"c94519f4-8cda-4d17-a851-bd4ff661f98b","Type":"ContainerStarted","Data":"e3e7e493263d77c8ff0629b7a2fef81b05ecd529060a0d57fc82eb4ad62f0d13"}
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.958174 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"]
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.959189 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.962243 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.962563 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.963053 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.963128 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.963265 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.963742 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.972577 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 01 06:50:39 crc kubenswrapper[5127]: I0201 06:50:39.972996 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"]
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.019071 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ls5xc" event={"ID":"bafc814f-6c41-40cf-b3f4-8babc6ec840a","Type":"ContainerStarted","Data":"fe248490e1bd107e24becc5c17f06cc3b8c0e3ff14be09fff877c09cb8e266a2"}
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.022998 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerStarted","Data":"c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800"}
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.024336 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" event={"ID":"c94519f4-8cda-4d17-a851-bd4ff661f98b","Type":"ContainerStarted","Data":"6ec57b9d0d06a47d01bcaf2cb20a895394f1da22f57b53a2e558fc118622ab0c"}
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.037739 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ls5xc" podStartSLOduration=170.037718803 podStartE2EDuration="2m50.037718803s" podCreationTimestamp="2026-02-01 06:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:40.03371301 +0000 UTC m=+190.519615383" watchObservedRunningTime="2026-02-01 06:50:40.037718803 +0000 UTC m=+190.523621166"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.072570 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rdtzv" podStartSLOduration=2.944242135 podStartE2EDuration="42.072550451s" podCreationTimestamp="2026-02-01 06:49:58 +0000 UTC" firstStartedPulling="2026-02-01 06:50:00.493103482 +0000 UTC m=+150.979005845" lastFinishedPulling="2026-02-01 06:50:39.621411798 +0000 UTC m=+190.107314161" observedRunningTime="2026-02-01 06:50:40.054751811 +0000 UTC m=+190.540654214" watchObservedRunningTime="2026-02-01 06:50:40.072550451 +0000 UTC m=+190.558452814"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.100733 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.101029 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ssnf\" (UniqueName: \"kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.101378 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.101476 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.101504 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.202353 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.202674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.203971 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.203908 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.204881 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.204995 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ssnf\" (UniqueName: \"kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.205252 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.205766 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.207993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.222977 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ssnf\" (UniqueName: \"kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf\") pod \"controller-manager-6bcb57b796-qblqz\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.280957 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.456374 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" podStartSLOduration=4.456357962 podStartE2EDuration="4.456357962s" podCreationTimestamp="2026-02-01 06:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:40.072916641 +0000 UTC m=+190.558819004" watchObservedRunningTime="2026-02-01 06:50:40.456357962 +0000 UTC m=+190.942260325"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.458492 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"]
Feb 01 06:50:40 crc kubenswrapper[5127]: W0201 06:50:40.464080 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79aa239_b7d5_4072_84d8_c02430bb1d80.slice/crio-5d16ba1e071be501493954863b708807fe485104f3b0e162a40fe692a821df43 WatchSource:0}: Error finding container 5d16ba1e071be501493954863b708807fe485104f3b0e162a40fe692a821df43: Status 404 returned error can't find the container with id 5d16ba1e071be501493954863b708807fe485104f3b0e162a40fe692a821df43
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.988812 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.989643 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.992009 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 01 06:50:40 crc kubenswrapper[5127]: I0201 06:50:40.992058 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.016358 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.016434 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.028803 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.041311 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" event={"ID":"f79aa239-b7d5-4072-84d8-c02430bb1d80","Type":"ContainerStarted","Data":"db9327bbf8ddb64ebe7d779689a1eade16d96f41627aa09b61453d08c14740f1"}
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.041344 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" event={"ID":"f79aa239-b7d5-4072-84d8-c02430bb1d80","Type":"ContainerStarted","Data":"5d16ba1e071be501493954863b708807fe485104f3b0e162a40fe692a821df43"}
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.046980 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.047600 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.066290 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.067660 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.089001 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" podStartSLOduration=5.088981563 podStartE2EDuration="5.088981563s" podCreationTimestamp="2026-02-01 06:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:41.065615937 +0000 UTC m=+191.551518300" watchObservedRunningTime="2026-02-01 06:50:41.088981563 +0000 UTC m=+191.574883926"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.119373 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.119502 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.120808 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.143303 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.302574 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:41 crc kubenswrapper[5127]: I0201 06:50:41.538494 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 01 06:50:42 crc kubenswrapper[5127]: I0201 06:50:42.058364 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"42d35ec3-3e94-4586-9c0a-515e3d3e41f2","Type":"ContainerStarted","Data":"b7e746fdf49dcd1243fca8d9588b15afdb8c036f3f9736f812e1d2ceffb25519"}
Feb 01 06:50:42 crc kubenswrapper[5127]: I0201 06:50:42.059808 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"42d35ec3-3e94-4586-9c0a-515e3d3e41f2","Type":"ContainerStarted","Data":"a23a280ecbeff4c9888a7958b56f55f6c2856ad4015a34b8848acabc39a15378"}
Feb 01 06:50:42 crc kubenswrapper[5127]: I0201 06:50:42.079220 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.079199198 podStartE2EDuration="2.079199198s" podCreationTimestamp="2026-02-01 06:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:42.077319626 +0000 UTC m=+192.563222019" watchObservedRunningTime="2026-02-01 06:50:42.079199198 +0000 UTC m=+192.565101571"
Feb 01 06:50:43 crc kubenswrapper[5127]: I0201 06:50:43.064279 5127 generic.go:334] "Generic (PLEG): container finished" podID="42d35ec3-3e94-4586-9c0a-515e3d3e41f2" containerID="b7e746fdf49dcd1243fca8d9588b15afdb8c036f3f9736f812e1d2ceffb25519" exitCode=0
Feb 01 06:50:43 crc kubenswrapper[5127]: I0201 06:50:43.064394 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"42d35ec3-3e94-4586-9c0a-515e3d3e41f2","Type":"ContainerDied","Data":"b7e746fdf49dcd1243fca8d9588b15afdb8c036f3f9736f812e1d2ceffb25519"}
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.305871 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.363422 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir\") pod \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") "
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.363513 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access\") pod \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\" (UID: \"42d35ec3-3e94-4586-9c0a-515e3d3e41f2\") "
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.363513 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "42d35ec3-3e94-4586-9c0a-515e3d3e41f2" (UID: "42d35ec3-3e94-4586-9c0a-515e3d3e41f2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.363719 5127 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.385797 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "42d35ec3-3e94-4586-9c0a-515e3d3e41f2" (UID: "42d35ec3-3e94-4586-9c0a-515e3d3e41f2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:50:44 crc kubenswrapper[5127]: I0201 06:50:44.465057 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/42d35ec3-3e94-4586-9c0a-515e3d3e41f2-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 01 06:50:45 crc kubenswrapper[5127]: I0201 06:50:45.079354 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"42d35ec3-3e94-4586-9c0a-515e3d3e41f2","Type":"ContainerDied","Data":"a23a280ecbeff4c9888a7958b56f55f6c2856ad4015a34b8848acabc39a15378"}
Feb 01 06:50:45 crc kubenswrapper[5127]: I0201 06:50:45.079423 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23a280ecbeff4c9888a7958b56f55f6c2856ad4015a34b8848acabc39a15378"
Feb 01 06:50:45 crc kubenswrapper[5127]: I0201 06:50:45.079419 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.785743 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 01 06:50:46 crc kubenswrapper[5127]: E0201 06:50:46.785999 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d35ec3-3e94-4586-9c0a-515e3d3e41f2" containerName="pruner"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.786014 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d35ec3-3e94-4586-9c0a-515e3d3e41f2" containerName="pruner"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.786152 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d35ec3-3e94-4586-9c0a-515e3d3e41f2" containerName="pruner"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.786784 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.788698 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.789460 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.792757 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.792833 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.792941 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.800963 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.894133 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.894397 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.894496 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.894430 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.894322 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock\") pod \"installer-9-crc\" (UID:
\"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:50:46 crc kubenswrapper[5127]: I0201 06:50:46.916436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access\") pod \"installer-9-crc\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:50:47 crc kubenswrapper[5127]: I0201 06:50:47.118689 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:50:47 crc kubenswrapper[5127]: I0201 06:50:47.635663 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 06:50:48 crc kubenswrapper[5127]: I0201 06:50:48.094792 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cd937e44-b5d4-420f-a6d7-6347e1fb3a10","Type":"ContainerStarted","Data":"fdfa2f0b50516c10c57769498e64f323e66f8c9e887875575cdb6d0cd7db93b6"} Feb 01 06:50:49 crc kubenswrapper[5127]: I0201 06:50:49.101227 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cd937e44-b5d4-420f-a6d7-6347e1fb3a10","Type":"ContainerStarted","Data":"df735347dab844e9f8f5b6b48642004b0250e57e386d1b93034ec3046a74a6b2"} Feb 01 06:50:49 crc kubenswrapper[5127]: I0201 06:50:49.119349 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.119322578 podStartE2EDuration="3.119322578s" podCreationTimestamp="2026-02-01 06:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:49.112923669 +0000 UTC m=+199.598826032" watchObservedRunningTime="2026-02-01 06:50:49.119322578 +0000 UTC m=+199.605224941" Feb 01 06:50:49 crc kubenswrapper[5127]: I0201 06:50:49.174186 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:50:49 crc kubenswrapper[5127]: I0201 06:50:49.174233 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:50:49 crc kubenswrapper[5127]: I0201 06:50:49.603058 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:50:50 crc kubenswrapper[5127]: I0201 06:50:50.164650 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:50:51 crc kubenswrapper[5127]: I0201 06:50:51.113763 5127 generic.go:334] "Generic (PLEG): container finished" podID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerID="b39329e91e0e982d2c770b0ed534d4c48a0f35309010a9e106e5205fb9d3881a" exitCode=0 Feb 01 06:50:51 crc kubenswrapper[5127]: I0201 06:50:51.113837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerDied","Data":"b39329e91e0e982d2c770b0ed534d4c48a0f35309010a9e106e5205fb9d3881a"} Feb 01 06:50:51 crc kubenswrapper[5127]: I0201 06:50:51.116085 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" 
event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerStarted","Data":"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414"} Feb 01 06:50:52 crc kubenswrapper[5127]: I0201 06:50:52.123374 5127 generic.go:334] "Generic (PLEG): container finished" podID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerID="5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414" exitCode=0 Feb 01 06:50:52 crc kubenswrapper[5127]: I0201 06:50:52.123650 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerDied","Data":"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.148441 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerStarted","Data":"caffc3b590bfb73029801dc2def95a720ca706e5dc8253486ac705faff6fe8da"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.150155 5127 generic.go:334] "Generic (PLEG): container finished" podID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerID="9e105f34e231a90950313a52d8dacb845b606144921dc76df1edef54bdc93d79" exitCode=0 Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.150197 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerDied","Data":"9e105f34e231a90950313a52d8dacb845b606144921dc76df1edef54bdc93d79"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.153111 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerStarted","Data":"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.156684 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerID="85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3" exitCode=0 Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.156736 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerDied","Data":"85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.159811 5127 generic.go:334] "Generic (PLEG): container finished" podID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerID="e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5" exitCode=0 Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.159840 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerDied","Data":"e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.163861 5127 generic.go:334] "Generic (PLEG): container finished" podID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerID="66da19ad99956146c19fd6a2610496c65cc643853d3c4eebade73e0a4286c772" exitCode=0 Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.163910 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" 
event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerDied","Data":"66da19ad99956146c19fd6a2610496c65cc643853d3c4eebade73e0a4286c772"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.168954 5127 generic.go:334] "Generic (PLEG): container finished" podID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerID="fa6b9d11a0e4db31140967403504baeb16c01b4c9f05807327e292dfbee4ff19" exitCode=0 Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.169005 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerDied","Data":"fa6b9d11a0e4db31140967403504baeb16c01b4c9f05807327e292dfbee4ff19"} Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.179284 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bxqsq" podStartSLOduration=3.861649476 podStartE2EDuration="55.179268035s" podCreationTimestamp="2026-02-01 06:50:00 +0000 UTC" firstStartedPulling="2026-02-01 06:50:02.633326032 +0000 UTC m=+153.119228395" lastFinishedPulling="2026-02-01 06:50:53.950944551 +0000 UTC m=+204.436846954" observedRunningTime="2026-02-01 06:50:55.178372189 +0000 UTC m=+205.664274552" watchObservedRunningTime="2026-02-01 06:50:55.179268035 +0000 UTC m=+205.665170398" Feb 01 06:50:55 crc kubenswrapper[5127]: I0201 06:50:55.223638 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dzj87" podStartSLOduration=3.773953011 podStartE2EDuration="57.22361628s" podCreationTimestamp="2026-02-01 06:49:58 +0000 UTC" firstStartedPulling="2026-02-01 06:50:00.501787676 +0000 UTC m=+150.987690039" lastFinishedPulling="2026-02-01 06:50:53.951450905 +0000 UTC m=+204.437353308" observedRunningTime="2026-02-01 06:50:55.220561034 +0000 UTC m=+205.706463417" watchObservedRunningTime="2026-02-01 06:50:55.22361628 +0000 UTC m=+205.709518643" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.193609 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerStarted","Data":"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28"} Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.197621 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerStarted","Data":"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"} Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.199868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerStarted","Data":"259a92deb2ca7af706d6b2ebbaaa42c7c955da0b8d665f348aa9ac9a6186e445"} Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.202000 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerStarted","Data":"f966fd9dbf355d0000f9d2bf533f58d6715ff5f09f44884903e448bb56030f91"} Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.211979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" 
event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerStarted","Data":"996689bad838905740ac144e399197cbbda1ccf618eb5560403521748a7ce044"} Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.215990 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l5p2j" podStartSLOduration=2.26551524 podStartE2EDuration="54.215971776s" podCreationTimestamp="2026-02-01 06:50:02 +0000 UTC" firstStartedPulling="2026-02-01 06:50:03.657638805 +0000 UTC m=+154.143541168" lastFinishedPulling="2026-02-01 06:50:55.608095331 +0000 UTC m=+206.093997704" observedRunningTime="2026-02-01 06:50:56.21255521 +0000 UTC m=+206.698457573" watchObservedRunningTime="2026-02-01 06:50:56.215971776 +0000 UTC m=+206.701874149" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.232270 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x4b5r" podStartSLOduration=2.033727327 podStartE2EDuration="57.232252103s" podCreationTimestamp="2026-02-01 06:49:59 +0000 UTC" firstStartedPulling="2026-02-01 06:50:00.476709961 +0000 UTC m=+150.962612324" lastFinishedPulling="2026-02-01 06:50:55.675234737 +0000 UTC m=+206.161137100" observedRunningTime="2026-02-01 06:50:56.229638621 +0000 UTC m=+206.715540984" watchObservedRunningTime="2026-02-01 06:50:56.232252103 +0000 UTC m=+206.718154466" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.252209 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jjscz" podStartSLOduration=3.354046108 podStartE2EDuration="55.252192254s" podCreationTimestamp="2026-02-01 06:50:01 +0000 UTC" firstStartedPulling="2026-02-01 06:50:03.657570233 +0000 UTC m=+154.143472596" lastFinishedPulling="2026-02-01 06:50:55.555716379 +0000 UTC m=+206.041618742" observedRunningTime="2026-02-01 06:50:56.246505254 +0000 UTC m=+206.732407617" watchObservedRunningTime="2026-02-01 06:50:56.252192254 +0000 UTC m=+206.738094617" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.265227 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6v4qm" podStartSLOduration=3.168519354 podStartE2EDuration="58.265210399s" podCreationTimestamp="2026-02-01 06:49:58 +0000 UTC" firstStartedPulling="2026-02-01 06:50:00.487102173 +0000 UTC m=+150.973004536" lastFinishedPulling="2026-02-01 06:50:55.583793188 +0000 UTC m=+206.069695581" observedRunningTime="2026-02-01 06:50:56.264714435 +0000 UTC m=+206.750616828" watchObservedRunningTime="2026-02-01 06:50:56.265210399 +0000 UTC m=+206.751112762" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.283300 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fxlfq" podStartSLOduration=2.25524568 podStartE2EDuration="55.283274677s" podCreationTimestamp="2026-02-01 06:50:01 +0000 UTC" firstStartedPulling="2026-02-01 06:50:02.647123648 +0000 UTC m=+153.133026011" lastFinishedPulling="2026-02-01 06:50:55.675152645 +0000 UTC m=+206.161055008" observedRunningTime="2026-02-01 06:50:56.280222632 +0000 UTC m=+206.766124995" watchObservedRunningTime="2026-02-01 06:50:56.283274677 +0000 UTC m=+206.769177040" Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.573762 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"] Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.574565 
5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" podUID="f79aa239-b7d5-4072-84d8-c02430bb1d80" containerName="controller-manager" containerID="cri-o://db9327bbf8ddb64ebe7d779689a1eade16d96f41627aa09b61453d08c14740f1" gracePeriod=30 Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.600881 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"] Feb 01 06:50:56 crc kubenswrapper[5127]: I0201 06:50:56.601364 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" podUID="c94519f4-8cda-4d17-a851-bd4ff661f98b" containerName="route-controller-manager" containerID="cri-o://6ec57b9d0d06a47d01bcaf2cb20a895394f1da22f57b53a2e558fc118622ab0c" gracePeriod=30 Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.223392 5127 generic.go:334] "Generic (PLEG): container finished" podID="c94519f4-8cda-4d17-a851-bd4ff661f98b" containerID="6ec57b9d0d06a47d01bcaf2cb20a895394f1da22f57b53a2e558fc118622ab0c" exitCode=0 Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.223446 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" event={"ID":"c94519f4-8cda-4d17-a851-bd4ff661f98b","Type":"ContainerDied","Data":"6ec57b9d0d06a47d01bcaf2cb20a895394f1da22f57b53a2e558fc118622ab0c"} Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.224606 5127 generic.go:334] "Generic (PLEG): container finished" podID="f79aa239-b7d5-4072-84d8-c02430bb1d80" containerID="db9327bbf8ddb64ebe7d779689a1eade16d96f41627aa09b61453d08c14740f1" exitCode=0 Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.224629 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" event={"ID":"f79aa239-b7d5-4072-84d8-c02430bb1d80","Type":"ContainerDied","Data":"db9327bbf8ddb64ebe7d779689a1eade16d96f41627aa09b61453d08c14740f1"} Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.631492 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.637736 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764145 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca\") pod \"c94519f4-8cda-4d17-a851-bd4ff661f98b\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764212 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf6sn\" (UniqueName: \"kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn\") pod \"c94519f4-8cda-4d17-a851-bd4ff661f98b\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764281 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca\") pod \"f79aa239-b7d5-4072-84d8-c02430bb1d80\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764321 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert\") pod \"c94519f4-8cda-4d17-a851-bd4ff661f98b\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764359 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ssnf\" (UniqueName: \"kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf\") pod \"f79aa239-b7d5-4072-84d8-c02430bb1d80\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764378 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config\") pod \"f79aa239-b7d5-4072-84d8-c02430bb1d80\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764405 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert\") pod \"f79aa239-b7d5-4072-84d8-c02430bb1d80\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764425 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config\") pod \"c94519f4-8cda-4d17-a851-bd4ff661f98b\" (UID: \"c94519f4-8cda-4d17-a851-bd4ff661f98b\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.764444 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles\") pod \"f79aa239-b7d5-4072-84d8-c02430bb1d80\" (UID: \"f79aa239-b7d5-4072-84d8-c02430bb1d80\") " Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.765145 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f79aa239-b7d5-4072-84d8-c02430bb1d80" 
(UID: "f79aa239-b7d5-4072-84d8-c02430bb1d80"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.765167 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config" (OuterVolumeSpecName: "config") pod "c94519f4-8cda-4d17-a851-bd4ff661f98b" (UID: "c94519f4-8cda-4d17-a851-bd4ff661f98b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.765197 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca" (OuterVolumeSpecName: "client-ca") pod "f79aa239-b7d5-4072-84d8-c02430bb1d80" (UID: "f79aa239-b7d5-4072-84d8-c02430bb1d80"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.765786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config" (OuterVolumeSpecName: "config") pod "f79aa239-b7d5-4072-84d8-c02430bb1d80" (UID: "f79aa239-b7d5-4072-84d8-c02430bb1d80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.765781 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c94519f4-8cda-4d17-a851-bd4ff661f98b" (UID: "c94519f4-8cda-4d17-a851-bd4ff661f98b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.769914 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn" (OuterVolumeSpecName: "kube-api-access-nf6sn") pod "c94519f4-8cda-4d17-a851-bd4ff661f98b" (UID: "c94519f4-8cda-4d17-a851-bd4ff661f98b"). InnerVolumeSpecName "kube-api-access-nf6sn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.770166 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c94519f4-8cda-4d17-a851-bd4ff661f98b" (UID: "c94519f4-8cda-4d17-a851-bd4ff661f98b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.770644 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f79aa239-b7d5-4072-84d8-c02430bb1d80" (UID: "f79aa239-b7d5-4072-84d8-c02430bb1d80"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.776789 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf" (OuterVolumeSpecName: "kube-api-access-7ssnf") pod "f79aa239-b7d5-4072-84d8-c02430bb1d80" (UID: "f79aa239-b7d5-4072-84d8-c02430bb1d80"). 
InnerVolumeSpecName "kube-api-access-7ssnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865528 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865570 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf6sn\" (UniqueName: \"kubernetes.io/projected/c94519f4-8cda-4d17-a851-bd4ff661f98b-kube-api-access-nf6sn\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865596 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865604 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94519f4-8cda-4d17-a851-bd4ff661f98b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865615 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ssnf\" (UniqueName: \"kubernetes.io/projected/f79aa239-b7d5-4072-84d8-c02430bb1d80-kube-api-access-7ssnf\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865624 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865632 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f79aa239-b7d5-4072-84d8-c02430bb1d80-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865641 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94519f4-8cda-4d17-a851-bd4ff661f98b-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.865649 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f79aa239-b7d5-4072-84d8-c02430bb1d80-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.977278 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:50:57 crc kubenswrapper[5127]: E0201 06:50:57.977567 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f79aa239-b7d5-4072-84d8-c02430bb1d80" containerName="controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.977604 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79aa239-b7d5-4072-84d8-c02430bb1d80" containerName="controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: E0201 06:50:57.977617 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94519f4-8cda-4d17-a851-bd4ff661f98b" containerName="route-controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.977625 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94519f4-8cda-4d17-a851-bd4ff661f98b" containerName="route-controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 
06:50:57.977743 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94519f4-8cda-4d17-a851-bd4ff661f98b" containerName="route-controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.977761 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f79aa239-b7d5-4072-84d8-c02430bb1d80" containerName="controller-manager" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.978159 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:57 crc kubenswrapper[5127]: I0201 06:50:57.997698 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.168534 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.168577 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.168850 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.168946 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4zb\" (UniqueName: \"kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.230991 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" event={"ID":"f79aa239-b7d5-4072-84d8-c02430bb1d80","Type":"ContainerDied","Data":"5d16ba1e071be501493954863b708807fe485104f3b0e162a40fe692a821df43"} Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.231018 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcb57b796-qblqz" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.231058 5127 scope.go:117] "RemoveContainer" containerID="db9327bbf8ddb64ebe7d779689a1eade16d96f41627aa09b61453d08c14740f1" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.232647 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" event={"ID":"c94519f4-8cda-4d17-a851-bd4ff661f98b","Type":"ContainerDied","Data":"e3e7e493263d77c8ff0629b7a2fef81b05ecd529060a0d57fc82eb4ad62f0d13"} Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.232720 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.246878 5127 scope.go:117] "RemoveContainer" containerID="6ec57b9d0d06a47d01bcaf2cb20a895394f1da22f57b53a2e558fc118622ab0c" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.261535 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"] Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.272094 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bcb57b796-qblqz"] Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.275538 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.275766 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4zb\" (UniqueName: \"kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.275916 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.276023 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.277076 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " 
pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.279873 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.288281 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"] Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.288399 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7cdbfd6d-7klkp"] Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.290911 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.292201 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4zb\" (UniqueName: \"kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb\") pod \"route-controller-manager-6dd66ff84d-phn96\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.591051 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.955860 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:50:58 crc kubenswrapper[5127]: I0201 06:50:58.956120 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.027005 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.056750 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:50:59 crc kubenswrapper[5127]: W0201 06:50:59.059708 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60525c9b_844f_42f5_a287_486e391b6121.slice/crio-ec6ebf71e2dd9d8dc3d38aa96cb1e267b54648fef3b9a962725ead1a6ffdb716 WatchSource:0}: Error finding container ec6ebf71e2dd9d8dc3d38aa96cb1e267b54648fef3b9a962725ead1a6ffdb716: Status 404 returned error can't find the container with id ec6ebf71e2dd9d8dc3d38aa96cb1e267b54648fef3b9a962725ead1a6ffdb716 Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.239432 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" event={"ID":"60525c9b-844f-42f5-a287-486e391b6121","Type":"ContainerStarted","Data":"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1"} Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.239708 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.239719 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" event={"ID":"60525c9b-844f-42f5-a287-486e391b6121","Type":"ContainerStarted","Data":"ec6ebf71e2dd9d8dc3d38aa96cb1e267b54648fef3b9a962725ead1a6ffdb716"} Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.241852 5127 patch_prober.go:28] interesting pod/route-controller-manager-6dd66ff84d-phn96 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.241885 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" podUID="60525c9b-844f-42f5-a287-486e391b6121" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.258150 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" podStartSLOduration=3.258128771 podStartE2EDuration="3.258128771s" podCreationTimestamp="2026-02-01 06:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:50:59.254693853 +0000 UTC m=+209.740596216" watchObservedRunningTime="2026-02-01 06:50:59.258128771 +0000 UTC m=+209.744031154" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.541276 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.541726 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.591189 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.643927 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.643961 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.709759 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.980396 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"] Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.981223 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.984910 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.988829 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.989048 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.989428 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.990878 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 06:50:59 crc kubenswrapper[5127]: I0201 06:50:59.997440 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.004334 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"] Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.006048 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.097834 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca\") pod \"controller-manager-b8789c864-fbk6c\" (UID: 
\"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.097909 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.097942 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2crd\" (UniqueName: \"kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.097968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.098020 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.199119 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.199178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2crd\" (UniqueName: \"kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.199246 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.200766 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 
06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.199278 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.201010 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.201014 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.201583 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.207796 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.216088 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2crd\" (UniqueName: \"kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd\") pod \"controller-manager-b8789c864-fbk6c\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.243660 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94519f4-8cda-4d17-a851-bd4ff661f98b" path="/var/lib/kubelet/pods/c94519f4-8cda-4d17-a851-bd4ff661f98b/volumes" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.244372 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79aa239-b7d5-4072-84d8-c02430bb1d80" path="/var/lib/kubelet/pods/f79aa239-b7d5-4072-84d8-c02430bb1d80/volumes" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.252555 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.293847 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x4b5r" Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.293922 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dzj87" Feb 01 
06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.305271 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c"
Feb 01 06:51:00 crc kubenswrapper[5127]: I0201 06:51:00.577352 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"]
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.076945 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bxqsq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.077522 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bxqsq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.126556 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bxqsq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.252215 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" event={"ID":"7de7ca86-36e7-478d-8bb8-1aba1a4058f1","Type":"ContainerStarted","Data":"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015"}
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.252273 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" event={"ID":"7de7ca86-36e7-478d-8bb8-1aba1a4058f1","Type":"ContainerStarted","Data":"0a93482a8174e53e53c605fffa5e48e723edec381767826bfcc15320fc39a337"}
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.288879 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bxqsq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.558131 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fxlfq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.558199 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fxlfq"
Feb 01 06:51:01 crc kubenswrapper[5127]: I0201 06:51:01.608170 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fxlfq"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.123280 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.123653 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jjscz"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.285080 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" podStartSLOduration=6.285065108 podStartE2EDuration="6.285065108s" podCreationTimestamp="2026-02-01 06:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:51:02.284968245 +0000 UTC m=+212.770870618" watchObservedRunningTime="2026-02-01 06:51:02.285065108 +0000 UTC m=+212.770967461"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.308904 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fxlfq"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.543837 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l5p2j"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.544103 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l5p2j"
Feb 01 06:51:02 crc kubenswrapper[5127]: I0201 06:51:02.680734 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"]
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.175519 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jjscz" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="registry-server" probeResult="failure" output=<
Feb 01 06:51:03 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 06:51:03 crc kubenswrapper[5127]: >
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.276700 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x4b5r" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="registry-server" containerID="cri-o://8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82" gracePeriod=2
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.601009 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-l5p2j" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="registry-server" probeResult="failure" output=<
Feb 01 06:51:03 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 06:51:03 crc kubenswrapper[5127]: >
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.799113 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4b5r"
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.868326 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qxpv\" (UniqueName: \"kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv\") pod \"40d7e137-a822-4ed9-b1cf-a123d53e4122\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") "
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.868363 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content\") pod \"40d7e137-a822-4ed9-b1cf-a123d53e4122\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") "
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.868397 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities\") pod \"40d7e137-a822-4ed9-b1cf-a123d53e4122\" (UID: \"40d7e137-a822-4ed9-b1cf-a123d53e4122\") "
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.869167 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities" (OuterVolumeSpecName: "utilities") pod "40d7e137-a822-4ed9-b1cf-a123d53e4122" (UID: "40d7e137-a822-4ed9-b1cf-a123d53e4122"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.876106 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv" (OuterVolumeSpecName: "kube-api-access-5qxpv") pod "40d7e137-a822-4ed9-b1cf-a123d53e4122" (UID: "40d7e137-a822-4ed9-b1cf-a123d53e4122"). InnerVolumeSpecName "kube-api-access-5qxpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.943450 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40d7e137-a822-4ed9-b1cf-a123d53e4122" (UID: "40d7e137-a822-4ed9-b1cf-a123d53e4122"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.969504 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qxpv\" (UniqueName: \"kubernetes.io/projected/40d7e137-a822-4ed9-b1cf-a123d53e4122-kube-api-access-5qxpv\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.969526 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:03 crc kubenswrapper[5127]: I0201 06:51:03.969535 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40d7e137-a822-4ed9-b1cf-a123d53e4122-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.286680 5127 generic.go:334] "Generic (PLEG): container finished" podID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerID="8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82" exitCode=0
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.286746 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerDied","Data":"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"}
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.286789 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4b5r" event={"ID":"40d7e137-a822-4ed9-b1cf-a123d53e4122","Type":"ContainerDied","Data":"c65ccf9628a04f47c291e8c03b4e4e77476bc37914b356e46721887c71b6a292"}
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.286813 5127 scope.go:117] "RemoveContainer" containerID="8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.287248 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4b5r"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.317796 5127 scope.go:117] "RemoveContainer" containerID="e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.320032 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"]
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.328520 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x4b5r"]
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.341037 5127 scope.go:117] "RemoveContainer" containerID="095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.376917 5127 scope.go:117] "RemoveContainer" containerID="8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"
Feb 01 06:51:04 crc kubenswrapper[5127]: E0201 06:51:04.377866 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82\": container with ID starting with 8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82 not found: ID does not exist" containerID="8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.377925 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82"} err="failed to get container status \"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82\": rpc error: code = NotFound desc = could not find container \"8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82\": container with ID starting with 8c8f4c948adf6d43d95490143fc5bd8b3737059ddab6d8a1f5467233ae3ffa82 not found: ID does not exist"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.377987 5127 scope.go:117] "RemoveContainer" containerID="e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5"
Feb 01 06:51:04 crc kubenswrapper[5127]: E0201 06:51:04.379017 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5\": container with ID starting with e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5 not found: ID does not exist" containerID="e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.379119 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5"} err="failed to get container status \"e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5\": rpc error: code = NotFound desc = could not find container \"e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5\": container with ID starting with e6dad1827271cca6caada45557f1c255029da507c21995f0253a0163d83a81a5 not found: ID does not exist"
Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.379446 5127 scope.go:117] "RemoveContainer" containerID="095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d"
Feb 01 06:51:04 crc kubenswrapper[5127]: E0201 06:51:04.380164 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d\": container with ID starting with 095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d not found: ID does not exist" containerID="095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d"
failed" err="rpc error: code = NotFound desc = could not find container \"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d\": container with ID starting with 095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d not found: ID does not exist" containerID="095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d" Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.380202 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d"} err="failed to get container status \"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d\": rpc error: code = NotFound desc = could not find container \"095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d\": container with ID starting with 095ca8f6f37e406e0db9fbed17c628ec93e0f1357a28c558127bad59f22fa58d not found: ID does not exist" Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.486401 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:51:04 crc kubenswrapper[5127]: I0201 06:51:04.487200 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dzj87" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="registry-server" containerID="cri-o://910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d" gracePeriod=2 Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.004148 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.077098 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"] Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.077691 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fxlfq" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="registry-server" containerID="cri-o://996689bad838905740ac144e399197cbbda1ccf618eb5560403521748a7ce044" gracePeriod=2 Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.086360 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn4fr\" (UniqueName: \"kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr\") pod \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.086458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities\") pod \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.086521 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content\") pod \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\" (UID: \"6dd3d1fb-13f1-452e-afa2-580c6d736be3\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.087438 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities" (OuterVolumeSpecName: "utilities") pod 
"6dd3d1fb-13f1-452e-afa2-580c6d736be3" (UID: "6dd3d1fb-13f1-452e-afa2-580c6d736be3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.092445 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr" (OuterVolumeSpecName: "kube-api-access-tn4fr") pod "6dd3d1fb-13f1-452e-afa2-580c6d736be3" (UID: "6dd3d1fb-13f1-452e-afa2-580c6d736be3"). InnerVolumeSpecName "kube-api-access-tn4fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.140877 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dd3d1fb-13f1-452e-afa2-580c6d736be3" (UID: "6dd3d1fb-13f1-452e-afa2-580c6d736be3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.187929 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn4fr\" (UniqueName: \"kubernetes.io/projected/6dd3d1fb-13f1-452e-afa2-580c6d736be3-kube-api-access-tn4fr\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.187968 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.187980 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dd3d1fb-13f1-452e-afa2-580c6d736be3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.298455 5127 generic.go:334] "Generic (PLEG): container finished" podID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerID="996689bad838905740ac144e399197cbbda1ccf618eb5560403521748a7ce044" exitCode=0 Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.298519 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerDied","Data":"996689bad838905740ac144e399197cbbda1ccf618eb5560403521748a7ce044"} Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.300900 5127 generic.go:334] "Generic (PLEG): container finished" podID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerID="910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d" exitCode=0 Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.300930 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerDied","Data":"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d"} Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.300969 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dzj87" event={"ID":"6dd3d1fb-13f1-452e-afa2-580c6d736be3","Type":"ContainerDied","Data":"0d4959dd54f527301ca9d76447a8829d6de382441c13dcb2c8685fbcd0df03b0"} Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.300996 5127 scope.go:117] "RemoveContainer" containerID="910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d" Feb 01 
06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.301092 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dzj87" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.338832 5127 scope.go:117] "RemoveContainer" containerID="5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.355112 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.364910 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dzj87"] Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.384979 5127 scope.go:117] "RemoveContainer" containerID="0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.407011 5127 scope.go:117] "RemoveContainer" containerID="910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d" Feb 01 06:51:05 crc kubenswrapper[5127]: E0201 06:51:05.415782 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d\": container with ID starting with 910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d not found: ID does not exist" containerID="910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.415832 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d"} err="failed to get container status \"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d\": rpc error: code = NotFound desc = could not find container \"910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d\": container with ID starting with 910b5c04450d1be618c8297a4fee798166b776f56f70b2d8aae9e645f5d6e70d not found: ID does not exist" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.415869 5127 scope.go:117] "RemoveContainer" containerID="5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414" Feb 01 06:51:05 crc kubenswrapper[5127]: E0201 06:51:05.416630 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414\": container with ID starting with 5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414 not found: ID does not exist" containerID="5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.416668 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414"} err="failed to get container status \"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414\": rpc error: code = NotFound desc = could not find container \"5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414\": container with ID starting with 5b0bf47e5116d8b10225392bd9844ba75438776cf55f671272b3b190517d8414 not found: ID does not exist" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.416699 5127 scope.go:117] "RemoveContainer" containerID="0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1" Feb 
01 06:51:05 crc kubenswrapper[5127]: E0201 06:51:05.421758 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1\": container with ID starting with 0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1 not found: ID does not exist" containerID="0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.421799 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1"} err="failed to get container status \"0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1\": rpc error: code = NotFound desc = could not find container \"0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1\": container with ID starting with 0dd55df3816d3a0d7c9b63f1aa575a146cddd1f380bb172d51db7251e63d22d1 not found: ID does not exist" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.565635 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.603679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities\") pod \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.603804 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95lhk\" (UniqueName: \"kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk\") pod \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.603839 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content\") pod \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\" (UID: \"e0285e4e-ca44-40ee-aaad-4c2bef41ce24\") " Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.604684 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities" (OuterVolumeSpecName: "utilities") pod "e0285e4e-ca44-40ee-aaad-4c2bef41ce24" (UID: "e0285e4e-ca44-40ee-aaad-4c2bef41ce24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.608175 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk" (OuterVolumeSpecName: "kube-api-access-95lhk") pod "e0285e4e-ca44-40ee-aaad-4c2bef41ce24" (UID: "e0285e4e-ca44-40ee-aaad-4c2bef41ce24"). InnerVolumeSpecName "kube-api-access-95lhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.635245 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0285e4e-ca44-40ee-aaad-4c2bef41ce24" (UID: "e0285e4e-ca44-40ee-aaad-4c2bef41ce24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.705652 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.705716 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95lhk\" (UniqueName: \"kubernetes.io/projected/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-kube-api-access-95lhk\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:05 crc kubenswrapper[5127]: I0201 06:51:05.705735 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0285e4e-ca44-40ee-aaad-4c2bef41ce24-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.242312 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" path="/var/lib/kubelet/pods/40d7e137-a822-4ed9-b1cf-a123d53e4122/volumes" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.243388 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" path="/var/lib/kubelet/pods/6dd3d1fb-13f1-452e-afa2-580c6d736be3/volumes" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.307890 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fxlfq" event={"ID":"e0285e4e-ca44-40ee-aaad-4c2bef41ce24","Type":"ContainerDied","Data":"60a56b4fdabaa2b15e68fa9b6ba0e7e0fccc9b39a50b1e23a4059deb1e83a059"} Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.307941 5127 scope.go:117] "RemoveContainer" containerID="996689bad838905740ac144e399197cbbda1ccf618eb5560403521748a7ce044" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.307961 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fxlfq" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.321811 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"] Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.324194 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fxlfq"] Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.327107 5127 scope.go:117] "RemoveContainer" containerID="9e105f34e231a90950313a52d8dacb845b606144921dc76df1edef54bdc93d79" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.346739 5127 scope.go:117] "RemoveContainer" containerID="dd57f9a6801dcbd62391abf651a039319f17cad6921122a1645808661a6a8695" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.741882 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.742014 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.742101 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.743075 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:51:06 crc kubenswrapper[5127]: I0201 06:51:06.743209 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786" gracePeriod=600 Feb 01 06:51:07 crc kubenswrapper[5127]: I0201 06:51:07.321162 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786" exitCode=0 Feb 01 06:51:07 crc kubenswrapper[5127]: I0201 06:51:07.321242 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786"} Feb 01 06:51:08 crc kubenswrapper[5127]: I0201 06:51:08.250431 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" path="/var/lib/kubelet/pods/e0285e4e-ca44-40ee-aaad-4c2bef41ce24/volumes" Feb 01 06:51:09 crc kubenswrapper[5127]: I0201 06:51:09.003519 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:51:09 crc kubenswrapper[5127]: I0201 06:51:09.340075 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca"} Feb 01 06:51:10 crc kubenswrapper[5127]: I0201 06:51:10.306260 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:10 crc kubenswrapper[5127]: I0201 06:51:10.312290 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:11 crc kubenswrapper[5127]: I0201 06:51:11.791656 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz642"] Feb 01 06:51:12 crc kubenswrapper[5127]: I0201 06:51:12.165723 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:51:12 crc kubenswrapper[5127]: I0201 06:51:12.205027 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:51:12 crc kubenswrapper[5127]: I0201 06:51:12.625871 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:51:12 crc kubenswrapper[5127]: I0201 06:51:12.705325 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:51:13 crc kubenswrapper[5127]: I0201 06:51:13.616486 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"] Feb 01 06:51:14 crc kubenswrapper[5127]: I0201 06:51:14.372062 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l5p2j" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="registry-server" containerID="cri-o://c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28" gracePeriod=2 Feb 01 06:51:14 crc kubenswrapper[5127]: I0201 06:51:14.986196 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.057575 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84qgc\" (UniqueName: \"kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc\") pod \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.057683 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities\") pod \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.057808 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content\") pod \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\" (UID: \"1c2c6b95-9c35-4fa1-be58-4e825fd86e97\") " Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.059647 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities" (OuterVolumeSpecName: "utilities") pod "1c2c6b95-9c35-4fa1-be58-4e825fd86e97" (UID: "1c2c6b95-9c35-4fa1-be58-4e825fd86e97"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.073979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc" (OuterVolumeSpecName: "kube-api-access-84qgc") pod "1c2c6b95-9c35-4fa1-be58-4e825fd86e97" (UID: "1c2c6b95-9c35-4fa1-be58-4e825fd86e97"). InnerVolumeSpecName "kube-api-access-84qgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.159616 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84qgc\" (UniqueName: \"kubernetes.io/projected/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-kube-api-access-84qgc\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.159664 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.186865 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c2c6b95-9c35-4fa1-be58-4e825fd86e97" (UID: "1c2c6b95-9c35-4fa1-be58-4e825fd86e97"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.260965 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c2c6b95-9c35-4fa1-be58-4e825fd86e97-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.381874 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerID="c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28" exitCode=0 Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.381949 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerDied","Data":"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28"} Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.382004 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l5p2j" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.382040 5127 scope.go:117] "RemoveContainer" containerID="c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.382020 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l5p2j" event={"ID":"1c2c6b95-9c35-4fa1-be58-4e825fd86e97","Type":"ContainerDied","Data":"9a08bd0d3a7dd927bfb3d7f09063299709fc4c50a4723a2245b2d98516a94e00"} Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.415856 5127 scope.go:117] "RemoveContainer" containerID="85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.428937 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"] Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.432843 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l5p2j"] Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.441398 5127 scope.go:117] "RemoveContainer" containerID="0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.460366 5127 scope.go:117] "RemoveContainer" containerID="c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28" Feb 01 06:51:15 crc kubenswrapper[5127]: E0201 06:51:15.460917 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28\": container with ID starting with c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28 not found: ID does not exist" containerID="c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.460977 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28"} err="failed to get container status \"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28\": rpc error: code = NotFound desc = could not find container \"c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28\": container with ID starting with c100bf2064939f8e037a94b18c139c78b4d0bdb790f592059a68c8dc19543e28 not found: ID does not exist" Feb 01 06:51:15 crc 
kubenswrapper[5127]: I0201 06:51:15.461017 5127 scope.go:117] "RemoveContainer" containerID="85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3" Feb 01 06:51:15 crc kubenswrapper[5127]: E0201 06:51:15.461452 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3\": container with ID starting with 85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3 not found: ID does not exist" containerID="85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.461518 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3"} err="failed to get container status \"85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3\": rpc error: code = NotFound desc = could not find container \"85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3\": container with ID starting with 85e4dd8d149ca1eca2302e1361ade686a8bdee122a55ac460cfc54dc472431e3 not found: ID does not exist" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.461553 5127 scope.go:117] "RemoveContainer" containerID="0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d" Feb 01 06:51:15 crc kubenswrapper[5127]: E0201 06:51:15.461931 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d\": container with ID starting with 0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d not found: ID does not exist" containerID="0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d" Feb 01 06:51:15 crc kubenswrapper[5127]: I0201 06:51:15.461966 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d"} err="failed to get container status \"0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d\": rpc error: code = NotFound desc = could not find container \"0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d\": container with ID starting with 0935ae81f65eb64b5abb13300c8d5a5cd3a312b82c40a95e9fc54516e4ed8b9d not found: ID does not exist" Feb 01 06:51:16 crc kubenswrapper[5127]: I0201 06:51:16.245154 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" path="/var/lib/kubelet/pods/1c2c6b95-9c35-4fa1-be58-4e825fd86e97/volumes" Feb 01 06:51:16 crc kubenswrapper[5127]: I0201 06:51:16.608807 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"] Feb 01 06:51:16 crc kubenswrapper[5127]: I0201 06:51:16.609035 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" podUID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" containerName="controller-manager" containerID="cri-o://f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015" gracePeriod=30 Feb 01 06:51:16 crc kubenswrapper[5127]: I0201 06:51:16.698062 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:51:16 crc kubenswrapper[5127]: I0201 06:51:16.698299 5127 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" podUID="60525c9b-844f-42f5-a287-486e391b6121" containerName="route-controller-manager" containerID="cri-o://ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1" gracePeriod=30 Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.194711 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.199094 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.285675 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq4zb\" (UniqueName: \"kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb\") pod \"60525c9b-844f-42f5-a287-486e391b6121\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.285858 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config\") pod \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.285941 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert\") pod \"60525c9b-844f-42f5-a287-486e391b6121\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286097 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config\") pod \"60525c9b-844f-42f5-a287-486e391b6121\" (UID: \"60525c9b-844f-42f5-a287-486e391b6121\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286137 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles\") pod \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286284 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2crd\" (UniqueName: \"kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd\") pod \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286317 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert\") pod \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286387 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca\") pod \"60525c9b-844f-42f5-a287-486e391b6121\" (UID: 
\"60525c9b-844f-42f5-a287-486e391b6121\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286459 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca\") pod \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\" (UID: \"7de7ca86-36e7-478d-8bb8-1aba1a4058f1\") " Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.286956 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7de7ca86-36e7-478d-8bb8-1aba1a4058f1" (UID: "7de7ca86-36e7-478d-8bb8-1aba1a4058f1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.287000 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config" (OuterVolumeSpecName: "config") pod "7de7ca86-36e7-478d-8bb8-1aba1a4058f1" (UID: "7de7ca86-36e7-478d-8bb8-1aba1a4058f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.287429 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca" (OuterVolumeSpecName: "client-ca") pod "60525c9b-844f-42f5-a287-486e391b6121" (UID: "60525c9b-844f-42f5-a287-486e391b6121"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.287462 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "7de7ca86-36e7-478d-8bb8-1aba1a4058f1" (UID: "7de7ca86-36e7-478d-8bb8-1aba1a4058f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.287458 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config" (OuterVolumeSpecName: "config") pod "60525c9b-844f-42f5-a287-486e391b6121" (UID: "60525c9b-844f-42f5-a287-486e391b6121"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.290843 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd" (OuterVolumeSpecName: "kube-api-access-n2crd") pod "7de7ca86-36e7-478d-8bb8-1aba1a4058f1" (UID: "7de7ca86-36e7-478d-8bb8-1aba1a4058f1"). InnerVolumeSpecName "kube-api-access-n2crd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.292808 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7de7ca86-36e7-478d-8bb8-1aba1a4058f1" (UID: "7de7ca86-36e7-478d-8bb8-1aba1a4058f1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.293200 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "60525c9b-844f-42f5-a287-486e391b6121" (UID: "60525c9b-844f-42f5-a287-486e391b6121"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.294435 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb" (OuterVolumeSpecName: "kube-api-access-qq4zb") pod "60525c9b-844f-42f5-a287-486e391b6121" (UID: "60525c9b-844f-42f5-a287-486e391b6121"). InnerVolumeSpecName "kube-api-access-qq4zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388641 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388693 5127 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388712 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq4zb\" (UniqueName: \"kubernetes.io/projected/60525c9b-844f-42f5-a287-486e391b6121-kube-api-access-qq4zb\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388726 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60525c9b-844f-42f5-a287-486e391b6121-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388737 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388748 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60525c9b-844f-42f5-a287-486e391b6121-config\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388758 5127 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388771 5127 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.388787 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2crd\" (UniqueName: \"kubernetes.io/projected/7de7ca86-36e7-478d-8bb8-1aba1a4058f1-kube-api-access-n2crd\") on node \"crc\" DevicePath \"\"" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.408110 5127 generic.go:334] "Generic (PLEG): container finished" podID="60525c9b-844f-42f5-a287-486e391b6121" 
containerID="ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1" exitCode=0 Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.408194 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" event={"ID":"60525c9b-844f-42f5-a287-486e391b6121","Type":"ContainerDied","Data":"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1"} Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.408225 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.408243 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96" event={"ID":"60525c9b-844f-42f5-a287-486e391b6121","Type":"ContainerDied","Data":"ec6ebf71e2dd9d8dc3d38aa96cb1e267b54648fef3b9a962725ead1a6ffdb716"} Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.408264 5127 scope.go:117] "RemoveContainer" containerID="ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.412065 5127 generic.go:334] "Generic (PLEG): container finished" podID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" containerID="f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015" exitCode=0 Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.412104 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.412115 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" event={"ID":"7de7ca86-36e7-478d-8bb8-1aba1a4058f1","Type":"ContainerDied","Data":"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015"} Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.412183 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8789c864-fbk6c" event={"ID":"7de7ca86-36e7-478d-8bb8-1aba1a4058f1","Type":"ContainerDied","Data":"0a93482a8174e53e53c605fffa5e48e723edec381767826bfcc15320fc39a337"} Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.446955 5127 scope.go:117] "RemoveContainer" containerID="ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1" Feb 01 06:51:17 crc kubenswrapper[5127]: E0201 06:51:17.447917 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1\": container with ID starting with ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1 not found: ID does not exist" containerID="ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.447952 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1"} err="failed to get container status \"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1\": rpc error: code = NotFound desc = could not find container \"ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1\": container with ID starting with ef76d38038dcd9bc8d617882db33103d5bf634f321e742c783fda742c6efe1e1 not 
found: ID does not exist" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.448006 5127 scope.go:117] "RemoveContainer" containerID="f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.456957 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"] Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.464461 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b8789c864-fbk6c"] Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.472027 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.473898 5127 scope.go:117] "RemoveContainer" containerID="f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015" Feb 01 06:51:17 crc kubenswrapper[5127]: E0201 06:51:17.474754 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015\": container with ID starting with f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015 not found: ID does not exist" containerID="f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.474803 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015"} err="failed to get container status \"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015\": rpc error: code = NotFound desc = could not find container \"f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015\": container with ID starting with f4a43da6331faf904be255d7d613911e34c6319fdc15283b1da0202992717015 not found: ID does not exist" Feb 01 06:51:17 crc kubenswrapper[5127]: I0201 06:51:17.476236 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dd66ff84d-phn96"] Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.002831 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx"] Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003239 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003268 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003291 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" containerName="controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003305 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" containerName="controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003324 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003338 5127 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003354 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003368 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003389 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003402 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003423 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003436 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003454 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003467 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003489 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003504 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003523 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003536 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003550 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60525c9b-844f-42f5-a287-486e391b6121" containerName="route-controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003564 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="60525c9b-844f-42f5-a287-486e391b6121" containerName="route-controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003604 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003619 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003638 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003651 5127 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003677 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003690 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="extract-content" Feb 01 06:51:18 crc kubenswrapper[5127]: E0201 06:51:18.003704 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003717 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="extract-utilities" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003892 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="40d7e137-a822-4ed9-b1cf-a123d53e4122" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003918 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" containerName="controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003937 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="60525c9b-844f-42f5-a287-486e391b6121" containerName="route-controller-manager" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003954 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0285e4e-ca44-40ee-aaad-4c2bef41ce24" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003969 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c2c6b95-9c35-4fa1-be58-4e825fd86e97" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.003986 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd3d1fb-13f1-452e-afa2-580c6d736be3" containerName="registry-server" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.004673 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.007503 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.007727 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.007897 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c89fb946d-gqnzn"] Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.008688 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.009173 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.009446 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.009510 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.011607 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.011888 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.014197 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx"] Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.014949 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.014949 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.015038 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.015110 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.017026 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.023617 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.074138 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c89fb946d-gqnzn"] Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.098100 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-config\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.098133 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvw72\" (UniqueName: \"kubernetes.io/projected/4f5429b9-24be-483d-8a39-e9db60a97ac2-kube-api-access-vvw72\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100434 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-proxy-ca-bundles\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100531 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9tpd\" (UniqueName: \"kubernetes.io/projected/e0dceb34-a081-4c95-85f9-15fe62133b66-kube-api-access-d9tpd\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100571 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dceb34-a081-4c95-85f9-15fe62133b66-serving-cert\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100613 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-client-ca\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100680 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f5429b9-24be-483d-8a39-e9db60a97ac2-serving-cert\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100699 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-config\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.100737 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-client-ca\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201597 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f5429b9-24be-483d-8a39-e9db60a97ac2-serving-cert\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201644 5127 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-config\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201665 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-client-ca\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201682 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-config\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201697 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvw72\" (UniqueName: \"kubernetes.io/projected/4f5429b9-24be-483d-8a39-e9db60a97ac2-kube-api-access-vvw72\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.201717 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-proxy-ca-bundles\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.202842 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-client-ca\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.202945 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-config\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.202979 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-config\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.203120 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9tpd\" (UniqueName: \"kubernetes.io/projected/e0dceb34-a081-4c95-85f9-15fe62133b66-kube-api-access-d9tpd\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: 
\"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.203409 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0dceb34-a081-4c95-85f9-15fe62133b66-proxy-ca-bundles\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.203455 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dceb34-a081-4c95-85f9-15fe62133b66-serving-cert\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.203906 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-client-ca\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.204261 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f5429b9-24be-483d-8a39-e9db60a97ac2-client-ca\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.205604 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f5429b9-24be-483d-8a39-e9db60a97ac2-serving-cert\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.207046 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0dceb34-a081-4c95-85f9-15fe62133b66-serving-cert\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.218679 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvw72\" (UniqueName: \"kubernetes.io/projected/4f5429b9-24be-483d-8a39-e9db60a97ac2-kube-api-access-vvw72\") pod \"route-controller-manager-584f4669fb-wprdx\" (UID: \"4f5429b9-24be-483d-8a39-e9db60a97ac2\") " pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.220014 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9tpd\" (UniqueName: \"kubernetes.io/projected/e0dceb34-a081-4c95-85f9-15fe62133b66-kube-api-access-d9tpd\") pod \"controller-manager-c89fb946d-gqnzn\" (UID: \"e0dceb34-a081-4c95-85f9-15fe62133b66\") " pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 
06:51:18.241822 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60525c9b-844f-42f5-a287-486e391b6121" path="/var/lib/kubelet/pods/60525c9b-844f-42f5-a287-486e391b6121/volumes" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.242486 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de7ca86-36e7-478d-8bb8-1aba1a4058f1" path="/var/lib/kubelet/pods/7de7ca86-36e7-478d-8bb8-1aba1a4058f1/volumes" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.387203 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.398452 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.666310 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c89fb946d-gqnzn"] Feb 01 06:51:18 crc kubenswrapper[5127]: W0201 06:51:18.673105 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0dceb34_a081_4c95_85f9_15fe62133b66.slice/crio-9cb67562d00f209222c8966fa86da51f791073f63b397c1c3b8707767291614f WatchSource:0}: Error finding container 9cb67562d00f209222c8966fa86da51f791073f63b397c1c3b8707767291614f: Status 404 returned error can't find the container with id 9cb67562d00f209222c8966fa86da51f791073f63b397c1c3b8707767291614f Feb 01 06:51:18 crc kubenswrapper[5127]: I0201 06:51:18.811138 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx"] Feb 01 06:51:18 crc kubenswrapper[5127]: W0201 06:51:18.828964 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f5429b9_24be_483d_8a39_e9db60a97ac2.slice/crio-af071f4ccc6a362c3a32745f42640db93d88402223229742854b34eb40bb726d WatchSource:0}: Error finding container af071f4ccc6a362c3a32745f42640db93d88402223229742854b34eb40bb726d: Status 404 returned error can't find the container with id af071f4ccc6a362c3a32745f42640db93d88402223229742854b34eb40bb726d Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.432443 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" event={"ID":"e0dceb34-a081-4c95-85f9-15fe62133b66","Type":"ContainerStarted","Data":"14e358b7169279cd19a248357b6cbb97ab9d3639fcec6d936fe6e622bb3c0c7f"} Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.432810 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" event={"ID":"e0dceb34-a081-4c95-85f9-15fe62133b66","Type":"ContainerStarted","Data":"9cb67562d00f209222c8966fa86da51f791073f63b397c1c3b8707767291614f"} Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.432849 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.434359 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" 
event={"ID":"4f5429b9-24be-483d-8a39-e9db60a97ac2","Type":"ContainerStarted","Data":"7b7c68065bd79536ee8e4133b0ed8d21ad15df88c0f7a3b0d57a56d839b3347a"} Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.434402 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" event={"ID":"4f5429b9-24be-483d-8a39-e9db60a97ac2","Type":"ContainerStarted","Data":"af071f4ccc6a362c3a32745f42640db93d88402223229742854b34eb40bb726d"} Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.434640 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.440706 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.441308 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.453926 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c89fb946d-gqnzn" podStartSLOduration=3.453901047 podStartE2EDuration="3.453901047s" podCreationTimestamp="2026-02-01 06:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:51:19.448331711 +0000 UTC m=+229.934234104" watchObservedRunningTime="2026-02-01 06:51:19.453901047 +0000 UTC m=+229.939803420" Feb 01 06:51:19 crc kubenswrapper[5127]: I0201 06:51:19.498911 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-584f4669fb-wprdx" podStartSLOduration=3.498895841 podStartE2EDuration="3.498895841s" podCreationTimestamp="2026-02-01 06:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:51:19.475071171 +0000 UTC m=+229.960973534" watchObservedRunningTime="2026-02-01 06:51:19.498895841 +0000 UTC m=+229.984798194" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.804378 5127 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.805531 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.806861 5127 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.807137 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b" gracePeriod=15 Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.807233 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919" gracePeriod=15 Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.807146 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3" gracePeriod=15 Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.807179 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322" gracePeriod=15 Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.807149 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471" gracePeriod=15 Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808489 5127 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.808728 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808766 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.808780 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808786 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.808801 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808809 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 06:51:25 crc 
kubenswrapper[5127]: E0201 06:51:25.808822 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808831 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.808843 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808849 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.808856 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808862 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808950 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808962 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808974 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808982 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.808989 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 06:51:25 crc kubenswrapper[5127]: E0201 06:51:25.809075 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.809082 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.809168 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.848823 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.920770 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.920831 5127 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.920851 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.920968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.920990 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.921008 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.921024 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:25 crc kubenswrapper[5127]: I0201 06:51:25.921054 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.021992 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022035 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022057 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022077 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022106 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022127 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022144 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022163 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022180 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022220 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022224 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 
crc kubenswrapper[5127]: I0201 06:51:26.022252 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022240 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022224 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022181 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.022234 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.150634 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:51:26 crc kubenswrapper[5127]: W0201 06:51:26.172903 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-a86d4f3c11e375e7774a0a44852f3057bf653dfe78dfb6b3defb3aec4762ce48 WatchSource:0}: Error finding container a86d4f3c11e375e7774a0a44852f3057bf653dfe78dfb6b3defb3aec4762ce48: Status 404 returned error can't find the container with id a86d4f3c11e375e7774a0a44852f3057bf653dfe78dfb6b3defb3aec4762ce48 Feb 01 06:51:26 crc kubenswrapper[5127]: E0201 06:51:26.178169 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18900cc3395c953a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,LastTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.481297 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.483284 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.483970 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3" exitCode=0 Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.484003 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919" exitCode=0 Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.484021 5127 scope.go:117] "RemoveContainer" containerID="b0c5fddebb82e99c86a35e0f8539567fe27df6c496e94b2aeca9ab79e0665818" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.484036 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471" exitCode=0 Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.484047 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322" exitCode=2 Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 
06:51:26.487497 5127 generic.go:334] "Generic (PLEG): container finished" podID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" containerID="df735347dab844e9f8f5b6b48642004b0250e57e386d1b93034ec3046a74a6b2" exitCode=0 Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.487558 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cd937e44-b5d4-420f-a6d7-6347e1fb3a10","Type":"ContainerDied","Data":"df735347dab844e9f8f5b6b48642004b0250e57e386d1b93034ec3046a74a6b2"} Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.488172 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.488508 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.489999 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"a86d4f3c11e375e7774a0a44852f3057bf653dfe78dfb6b3defb3aec4762ce48"} Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.490888 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:26 crc kubenswrapper[5127]: I0201 06:51:26.491177 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:27 crc kubenswrapper[5127]: I0201 06:51:27.497360 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 06:51:27 crc kubenswrapper[5127]: I0201 06:51:27.500447 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"664cc13bb4b1d526e2baa725796c4301d5411f72280e39d37bd99715f0b19d39"} Feb 01 06:51:27 crc kubenswrapper[5127]: I0201 06:51:27.999193 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:27.999997 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.000242 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.189121 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.189815 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.190263 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.190483 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.190838 5127 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.191771 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir\") pod \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.191860 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access\") pod \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.191886 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock\") pod \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\" (UID: \"cd937e44-b5d4-420f-a6d7-6347e1fb3a10\") " Feb 01 06:51:28 
crc kubenswrapper[5127]: I0201 06:51:28.192004 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cd937e44-b5d4-420f-a6d7-6347e1fb3a10" (UID: "cd937e44-b5d4-420f-a6d7-6347e1fb3a10"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.192048 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock" (OuterVolumeSpecName: "var-lock") pod "cd937e44-b5d4-420f-a6d7-6347e1fb3a10" (UID: "cd937e44-b5d4-420f-a6d7-6347e1fb3a10"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.192113 5127 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.192124 5127 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-var-lock\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.200861 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cd937e44-b5d4-420f-a6d7-6347e1fb3a10" (UID: "cd937e44-b5d4-420f-a6d7-6347e1fb3a10"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.227671 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18900cc3395c953a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,LastTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292712 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292798 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292830 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292851 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292896 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.292983 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.293037 5127 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.293052 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd937e44-b5d4-420f-a6d7-6347e1fb3a10-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.293064 5127 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.293076 5127 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.508927 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cd937e44-b5d4-420f-a6d7-6347e1fb3a10","Type":"ContainerDied","Data":"fdfa2f0b50516c10c57769498e64f323e66f8c9e887875575cdb6d0cd7db93b6"}
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.508988 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdfa2f0b50516c10c57769498e64f323e66f8c9e887875575cdb6d0cd7db93b6"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.509107 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.512665 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.513980 5127 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b" exitCode=0
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.514078 5127 scope.go:117] "RemoveContainer" containerID="cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.514189 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.515166 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.515756 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.516362 5127 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.517123 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.518266 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.519857 5127 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.540088 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.540709 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.540738 5127 scope.go:117] "RemoveContainer" containerID="1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.541112 5127 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.562984 5127 scope.go:117] "RemoveContainer" containerID="ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.580237 5127 scope.go:117] "RemoveContainer" containerID="b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.596661 5127 scope.go:117] "RemoveContainer" containerID="fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.619766 5127 scope.go:117] "RemoveContainer" containerID="a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.703107 5127 scope.go:117] "RemoveContainer" containerID="cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.703999 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\": container with ID starting with cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3 not found: ID does not exist" containerID="cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.704052 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3"} err="failed to get container status \"cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\": rpc error: code = NotFound desc = could not find container \"cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3\": container with ID starting with cb3d3d3973253682fb7eb9fb95a2b5da210cea6bf67f622f0ee55ee168853bf3 not found: ID does not exist"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.704089 5127 scope.go:117] "RemoveContainer" containerID="1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.704824 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\": container with ID starting with 1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919 not found: ID does not exist" containerID="1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.704868 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919"} err="failed to get container status \"1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\": rpc error: code = NotFound desc = could not find container \"1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919\": container with ID starting with 1dd8b75012188ed90a49f59cbac367c57fff6e598cf508a22d2b50ff148ea919 not found: ID does not exist"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.704903 5127 scope.go:117] "RemoveContainer" containerID="ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.705352 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\": container with ID starting with ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471 not found: ID does not exist" containerID="ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.705381 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471"} err="failed to get container status \"ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\": rpc error: code = NotFound desc = could not find container \"ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471\": container with ID starting with ecb3e865b21df675f1430a3d6ea07903ccc37cd819ec1972c47be7ab815cb471 not found: ID does not exist"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.705404 5127 scope.go:117] "RemoveContainer" containerID="b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.705850 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\": container with ID starting with b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322 not found: ID does not exist" containerID="b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.705889 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322"} err="failed to get container status \"b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\": rpc error: code = NotFound desc = could not find container \"b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322\": container with ID starting with b13435a0e242d23842defda9f88f06a590b6751405222e719e9160c976021322 not found: ID does not exist"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.705915 5127 scope.go:117] "RemoveContainer" containerID="fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.706498 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\": container with ID starting with fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b not found: ID does not exist" containerID="fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.706532 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b"} err="failed to get container status \"fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\": rpc error: code = NotFound desc = could not find container \"fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b\": container with ID starting with fdcbdbacb9f348f85420adeec28cda713b9114107804156720ea95d5eb0dff1b not found: ID does not exist"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.706551 5127 scope.go:117] "RemoveContainer" containerID="a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8"
Feb 01 06:51:28 crc kubenswrapper[5127]: E0201 06:51:28.707038 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\": container with ID starting with a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8 not found: ID does not exist" containerID="a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8"
Feb 01 06:51:28 crc kubenswrapper[5127]: I0201 06:51:28.707107 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8"} err="failed to get container status \"a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\": rpc error: code = NotFound desc = could not find container \"a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8\": container with ID starting with a1622cddad4ea2c3439192fd114d59bc2bb0069e83482cb85052cdaafec43ee8 not found: ID does not exist"
Feb 01 06:51:30 crc kubenswrapper[5127]: I0201 06:51:30.239937 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: I0201 06:51:30.240936 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: I0201 06:51:30.241640 5127 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: I0201 06:51:30.249482 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.624281 5127 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.624774 5127 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.625385 5127 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.626341 5127 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.627414 5127 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:30 crc kubenswrapper[5127]: I0201 06:51:30.627469 5127 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.627956 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms"
Feb 01 06:51:30 crc kubenswrapper[5127]: E0201 06:51:30.828624 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms"
Feb 01 06:51:31 crc kubenswrapper[5127]: E0201 06:51:31.229502 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms"
Feb 01 06:51:32 crc kubenswrapper[5127]: E0201 06:51:32.031561 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s"
Feb 01 06:51:33 crc kubenswrapper[5127]: E0201 06:51:33.632714 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s"
Feb 01 06:51:36 crc kubenswrapper[5127]: I0201 06:51:36.833907 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerName="oauth-openshift" containerID="cri-o://4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17" gracePeriod=15
Feb 01 06:51:36 crc kubenswrapper[5127]: E0201 06:51:36.834501 5127 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.455271 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.456314 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.456862 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.457327 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.576442 5127 generic.go:334] "Generic (PLEG): container finished" podID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerID="4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17" exitCode=0
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.576710 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" event={"ID":"ef8c34c0-f858-472e-8560-5e7806b32eab","Type":"ContainerDied","Data":"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"}
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.576815 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-kz642"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.577034 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" event={"ID":"ef8c34c0-f858-472e-8560-5e7806b32eab","Type":"ContainerDied","Data":"46d8a013987bc5c13505df9e8dcdcce4985b895d83011293af9c42c5f750e8c7"}
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.577079 5127 scope.go:117] "RemoveContainer" containerID="4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.578345 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.579072 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.579771 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.598605 5127 scope.go:117] "RemoveContainer" containerID="4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"
Feb 01 06:51:37 crc kubenswrapper[5127]: E0201 06:51:37.599795 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17\": container with ID starting with 4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17 not found: ID does not exist" containerID="4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.599838 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17"} err="failed to get container status \"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17\": rpc error: code = NotFound desc = could not find container \"4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17\": container with ID starting with 4b87fd192fd185effb7fbbcc105247d45d6533c85fcb511aa9a084c6cf4b7a17 not found: ID does not exist"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638322 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638630 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638696 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638774 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638813 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638865 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.638971 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhcwz\" (UniqueName: \"kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639057 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639050 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639165 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639239 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639274 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639328 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.639896 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640005 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640077 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error\") pod \"ef8c34c0-f858-472e-8560-5e7806b32eab\" (UID: \"ef8c34c0-f858-472e-8560-5e7806b32eab\") "
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640295 5127 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640316 5127 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ef8c34c0-f858-472e-8560-5e7806b32eab-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640327 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640389 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.640946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.646700 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.647175 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.647560 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.647979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.651183 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz" (OuterVolumeSpecName: "kube-api-access-hhcwz") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "kube-api-access-hhcwz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.654880 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.655238 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.655571 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.655868 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ef8c34c0-f858-472e-8560-5e7806b32eab" (UID: "ef8c34c0-f858-472e-8560-5e7806b32eab"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.741948 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhcwz\" (UniqueName: \"kubernetes.io/projected/ef8c34c0-f858-472e-8560-5e7806b32eab-kube-api-access-hhcwz\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742001 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742027 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742051 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742070 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742088 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742106 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742123 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742141 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742159 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.742176 5127 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ef8c34c0-f858-472e-8560-5e7806b32eab-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.905698 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.906765 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:37 crc kubenswrapper[5127]: I0201 06:51:37.907187 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:38 crc kubenswrapper[5127]: E0201 06:51:38.228906 5127 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18900cc3395c953a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,LastTimestamp:2026-02-01 06:51:26.177129786 +0000 UTC m=+236.663032189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.589378 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.589463 5127 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6" exitCode=1
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.589525 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6"}
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.590966 5127 scope.go:117] "RemoveContainer" containerID="695c2352ee4064b038db1b5cb7a97fdce72ea7f4be29bf3a286c1415048fd5b6"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.591067 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.591780 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.592288 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:38 crc kubenswrapper[5127]: I0201 06:51:38.592811 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.235632 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.237225 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.237941 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.238245 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.238541 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.257775 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.257814 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:39 crc kubenswrapper[5127]: E0201 06:51:39.258315 5127 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.259056 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:39 crc kubenswrapper[5127]: W0201 06:51:39.293059 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-7cb7d36b4aab68ab4b8eb2b05606a4521d364e215a87d7796cea4775a8372e75 WatchSource:0}: Error finding container 7cb7d36b4aab68ab4b8eb2b05606a4521d364e215a87d7796cea4775a8372e75: Status 404 returned error can't find the container with id 7cb7d36b4aab68ab4b8eb2b05606a4521d364e215a87d7796cea4775a8372e75
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.600662 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0d91215543d6740d95a60e281f39b90c55cb067934676b2ed1befa3cba4d03f2"}
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.600713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7cb7d36b4aab68ab4b8eb2b05606a4521d364e215a87d7796cea4775a8372e75"}
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.601094 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.601118 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.601805 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.602201 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: E0201 06:51:39.602227 5127 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.602673 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.603453 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.606136 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.606222 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c08a295cabf23a7ff60710448faccb30bbe28f915b3ddf14279d0f98b5a42e80"}
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.607300 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.607715 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.608191 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:39 crc kubenswrapper[5127]: I0201 06:51:39.609754 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.244367 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.244863 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.245368 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.246037 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.246657 5127 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.618473 5127 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="0d91215543d6740d95a60e281f39b90c55cb067934676b2ed1befa3cba4d03f2" exitCode=0
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.618625 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"0d91215543d6740d95a60e281f39b90c55cb067934676b2ed1befa3cba4d03f2"}
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.619105 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.619168 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.619503 5127 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: E0201 06:51:40.619678 5127 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.620687 5127 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.621188 5127 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.621532 5127 status_manager.go:851] "Failed to get status for pod" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" pod="openshift-authentication/oauth-openshift-558db77b4-kz642" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-kz642\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:40 crc kubenswrapper[5127]: I0201 06:51:40.621976 5127 status_manager.go:851] "Failed to get status for pod" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Feb 01 06:51:41 crc kubenswrapper[5127]: I0201 06:51:41.389279 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 01 06:51:41 crc kubenswrapper[5127]: I0201 06:51:41.631560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cf28df136cad66e739665bb846b3c71f2f34e2f742de913942437e407828cd57"}
Feb 01 06:51:41 crc kubenswrapper[5127]: I0201 06:51:41.631622 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf1685372673cd3938d1ff265190ce058511405a5b2c12cb7f8cbae1b6560716"}
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.639835 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8bac15a280f6a7a50de58f97766252d263a50a1477a11706251d41c98c526dcf"}
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.640170 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dc4da3e7da3bb64c31680d3ab80de63f4327742606899734e74d064d499a81df"}
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.640183 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"35c482b99757dfee7027225f801376990ccaced89ce0d19d18f5a679ff04ebac"}
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.640459 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.640477 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82"
Feb 01 06:51:42 crc kubenswrapper[5127]: I0201 06:51:42.640661 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 01
06:51:44 crc kubenswrapper[5127]: I0201 06:51:44.260145 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:44 crc kubenswrapper[5127]: I0201 06:51:44.260198 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:44 crc kubenswrapper[5127]: I0201 06:51:44.267203 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.539802 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.552441 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.654148 5127 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.670327 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.670355 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.674033 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:51:47 crc kubenswrapper[5127]: I0201 06:51:47.701006 5127 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9440397-6d96-47fd-bc95-3e951b8188b1" Feb 01 06:51:48 crc kubenswrapper[5127]: I0201 06:51:48.683732 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:51:48 crc kubenswrapper[5127]: I0201 06:51:48.683790 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:51:48 crc kubenswrapper[5127]: I0201 06:51:48.688422 5127 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d9440397-6d96-47fd-bc95-3e951b8188b1" Feb 01 06:51:51 crc kubenswrapper[5127]: I0201 06:51:51.393117 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 06:51:57 crc kubenswrapper[5127]: I0201 06:51:57.316188 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 01 06:51:57 crc kubenswrapper[5127]: I0201 06:51:57.452848 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 01 06:51:57 crc kubenswrapper[5127]: I0201 06:51:57.753064 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 01 06:51:57 crc kubenswrapper[5127]: I0201 06:51:57.768424 
5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 06:51:57 crc kubenswrapper[5127]: I0201 06:51:57.907063 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.114245 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.181151 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.640090 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.659516 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.688644 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.817050 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.928461 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 06:51:58 crc kubenswrapper[5127]: I0201 06:51:58.992834 5127 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.033984 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.224946 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.227247 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.577628 5127 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.644241 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.839726 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 01 06:51:59 crc kubenswrapper[5127]: I0201 06:51:59.886040 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 06:52:00.086347 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 06:52:00.116864 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 
06:52:00.403531 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 06:52:00.415238 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 06:52:00.652450 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 01 06:52:00 crc kubenswrapper[5127]: I0201 06:52:00.856055 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.054097 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.159613 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.224893 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.332329 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.391042 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.449301 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.643110 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.647399 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.672806 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 01 06:52:01 crc kubenswrapper[5127]: I0201 06:52:01.903537 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.027269 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.075572 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.210904 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.252614 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.259882 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.261920 5127 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.274199 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.335991 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.389017 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.391104 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.478546 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.681649 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.730482 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.800733 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.878816 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.901147 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 01 06:52:02 crc kubenswrapper[5127]: I0201 06:52:02.975764 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.042937 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.070411 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.144557 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.199225 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.218712 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.224278 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.274281 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.281894 5127 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.358768 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.363674 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.422313 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.530740 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.653718 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.663732 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.853246 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 06:52:03 crc kubenswrapper[5127]: I0201 06:52:03.939427 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.008113 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.123439 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.191171 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.223364 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.246155 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.346700 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.378022 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.416017 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.456773 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.468667 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 
06:52:04.488232 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.617683 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.655537 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.866023 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.952296 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.970878 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 01 06:52:04 crc kubenswrapper[5127]: I0201 06:52:04.970998 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.003485 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.078514 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.115721 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.148285 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.150610 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.191158 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.438462 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.496033 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.510307 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.619436 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.694096 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.733336 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 
01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.760191 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.760963 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.784826 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.833100 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 06:52:05 crc kubenswrapper[5127]: I0201 06:52:05.873820 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.151478 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.180999 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.223102 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.354190 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.386957 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.473934 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.586214 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.596123 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.600961 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.666016 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.684961 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.732201 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 01 06:52:06 crc kubenswrapper[5127]: I0201 06:52:06.889574 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.014043 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.030465 5127 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.053922 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.080165 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.090792 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.096828 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.135971 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.136388 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.137784 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.226858 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.286477 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.361887 5127 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.363066 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=42.363048912 podStartE2EDuration="42.363048912s" podCreationTimestamp="2026-02-01 06:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:51:47.67755196 +0000 UTC m=+258.163454333" watchObservedRunningTime="2026-02-01 06:52:07.363048912 +0000 UTC m=+277.848951275" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.367878 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-kz642","openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.368209 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d568d465d-qdlsh","openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 06:52:07 crc kubenswrapper[5127]: E0201 06:52:07.369086 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" containerName="installer" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369117 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" containerName="installer" Feb 01 06:52:07 crc kubenswrapper[5127]: E0201 06:52:07.369130 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerName="oauth-openshift" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369137 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerName="oauth-openshift" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369257 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd937e44-b5d4-420f-a6d7-6347e1fb3a10" containerName="installer" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369278 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" containerName="oauth-openshift" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369736 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369838 5127 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.369874 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="de21fcd8-f19d-4564-984f-22a1ab77dd82" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.374214 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.374942 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375164 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375636 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375764 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375831 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375771 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.375990 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.376277 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.376337 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.376448 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.377414 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 
06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.379361 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.388711 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.389768 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.395643 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.395624251 podStartE2EDuration="20.395624251s" podCreationTimestamp="2026-02-01 06:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:52:07.392006259 +0000 UTC m=+277.877908632" watchObservedRunningTime="2026-02-01 06:52:07.395624251 +0000 UTC m=+277.881526614" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.399718 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.401759 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.435161 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.442409 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.451503 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458660 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-dir\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458702 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458728 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458800 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458847 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458878 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458928 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxsfl\" (UniqueName: \"kubernetes.io/projected/f80b4c79-6dff-4289-b670-53143db5e8fc-kube-api-access-fxsfl\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458957 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-policies\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.458980 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.459016 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.459049 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " 
pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.459087 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-session\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.459112 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.459136 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.472819 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.505340 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560069 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560120 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560146 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-session\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560165 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " 
pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560184 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560220 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-dir\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560247 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560265 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560283 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560304 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560320 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560337 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxsfl\" (UniqueName: \"kubernetes.io/projected/f80b4c79-6dff-4289-b670-53143db5e8fc-kube-api-access-fxsfl\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " 
pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560355 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-policies\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.560372 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.561442 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-dir\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.561669 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-audit-policies\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.561838 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.562076 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.562673 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.565345 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.565982 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.566066 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.566223 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-session\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.566656 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.566770 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.567457 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.575178 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f80b4c79-6dff-4289-b670-53143db5e8fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.578979 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxsfl\" (UniqueName: \"kubernetes.io/projected/f80b4c79-6dff-4289-b670-53143db5e8fc-kube-api-access-fxsfl\") pod \"oauth-openshift-7d568d465d-qdlsh\" (UID: \"f80b4c79-6dff-4289-b670-53143db5e8fc\") " pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.580894 5127 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.630156 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.707970 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.719296 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.852451 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.886381 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.916791 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.936221 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 01 06:52:07 crc kubenswrapper[5127]: I0201 06:52:07.951017 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.073973 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.089288 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.208241 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.233454 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.242161 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8c34c0-f858-472e-8560-5e7806b32eab" path="/var/lib/kubelet/pods/ef8c34c0-f858-472e-8560-5e7806b32eab/volumes" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.255345 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.268355 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.291868 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.367412 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.368292 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 01 06:52:08 
crc kubenswrapper[5127]: I0201 06:52:08.379195 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.414245 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.439576 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.569239 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.623240 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.660019 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.662938 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.678402 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.686122 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.689542 5127 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.735874 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.763335 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.763469 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.798057 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.916538 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.921564 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 01 06:52:08 crc kubenswrapper[5127]: I0201 06:52:08.970045 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.013571 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.164369 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" 
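
The long burst of reflector.go:368 "Caches populated" lines above comes from the kubelet's watch-based Secret/ConfigMap manager: each Secret or ConfigMap that a scheduled pod references gets its own list+watch reflector (named object-"namespace"/"name"), and the message marks the moment that reflector's initial sync completes. Below is a minimal client-go sketch of the same sync step, assuming a reachable kubeconfig at the default location; the namespace and ConfigMap name are taken from the log above, everything else is illustrative, and the kubelet's real implementation (single-object reflectors behind an internal manager) differs in detail.

package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Scope the informer to one namespace, loosely mirroring how the kubelet
	// scopes each reflector to the objects a pod actually references.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openshift-ingress"))
	cmInformer := factory.Core().V1().ConfigMaps().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())

	// WaitForCacheSync returning true is the transition reflector.go logs
	// as "Caches populated" for the watched type.
	if !cache.WaitForCacheSync(ctx.Done(), cmInformer.HasSynced) {
		panic("informer cache never synced")
	}

	// From here on, reads are served from the local cache, not the API server.
	cm, err := factory.Core().V1().ConfigMaps().Lister().
		ConfigMaps("openshift-ingress").Get("service-ca-bundle")
	if err != nil {
		panic(err)
	}
	fmt.Println("cached:", cm.Name)
}

This is why the mounts earlier in the log can succeed in bulk: once the per-object caches are populated, MountVolume.SetUp reads Secret and ConfigMap payloads locally instead of issuing a GET per volume.
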
Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.330153 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.364764 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d568d465d-qdlsh"] Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.385041 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.437238 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.599228 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.640655 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.743151 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.814021 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" event={"ID":"f80b4c79-6dff-4289-b670-53143db5e8fc","Type":"ContainerStarted","Data":"d46891c8bafec6f12c36e41dbbe3f12385e4329fc18df814396f4a390c8896d3"} Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.814061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" event={"ID":"f80b4c79-6dff-4289-b670-53143db5e8fc","Type":"ContainerStarted","Data":"d31b0085ee08722b45beeea7833784498d190f9419e5b5bdf27d7379bb7d8221"} Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.814295 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.845075 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.848888 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" podStartSLOduration=58.848864242 podStartE2EDuration="58.848864242s" podCreationTimestamp="2026-02-01 06:51:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:52:09.847105462 +0000 UTC m=+280.333007865" watchObservedRunningTime="2026-02-01 06:52:09.848864242 +0000 UTC m=+280.334766645" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.861358 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.862635 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.887991 5127 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"openshift-service-ca.crt" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.927191 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 01 06:52:09 crc kubenswrapper[5127]: I0201 06:52:09.997788 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.003953 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.008444 5127 patch_prober.go:28] interesting pod/oauth-openshift-7d568d465d-qdlsh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.63:6443/healthz\": read tcp 10.217.0.2:40276->10.217.0.63:6443: read: connection reset by peer" start-of-body= Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.008521 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" podUID="f80b4c79-6dff-4289-b670-53143db5e8fc" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.63:6443/healthz\": read tcp 10.217.0.2:40276->10.217.0.63:6443: read: connection reset by peer" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.030066 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.043674 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.155185 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.165723 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.208641 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.296960 5127 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.297256 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://664cc13bb4b1d526e2baa725796c4301d5411f72280e39d37bd99715f0b19d39" gracePeriod=5 Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.372016 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.432802 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.465463 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.496993 5127 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress"/"kube-root-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.555238 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.565303 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.623422 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.658675 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.711355 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.714063 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.819668 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7d568d465d-qdlsh_f80b4c79-6dff-4289-b670-53143db5e8fc/oauth-openshift/0.log" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.819706 5127 generic.go:334] "Generic (PLEG): container finished" podID="f80b4c79-6dff-4289-b670-53143db5e8fc" containerID="d46891c8bafec6f12c36e41dbbe3f12385e4329fc18df814396f4a390c8896d3" exitCode=255 Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.819731 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" event={"ID":"f80b4c79-6dff-4289-b670-53143db5e8fc","Type":"ContainerDied","Data":"d46891c8bafec6f12c36e41dbbe3f12385e4329fc18df814396f4a390c8896d3"} Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.820047 5127 scope.go:117] "RemoveContainer" containerID="d46891c8bafec6f12c36e41dbbe3f12385e4329fc18df814396f4a390c8896d3" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.841041 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.856735 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 01 06:52:10 crc kubenswrapper[5127]: I0201 06:52:10.985153 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.037480 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.116706 5127 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.203600 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.226943 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 06:52:11 crc 
kubenswrapper[5127]: I0201 06:52:11.393481 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.422062 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.515390 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.692739 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.705172 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.748258 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.781082 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.790294 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.828084 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-7d568d465d-qdlsh_f80b4c79-6dff-4289-b670-53143db5e8fc/oauth-openshift/0.log" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.828158 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" event={"ID":"f80b4c79-6dff-4289-b670-53143db5e8fc","Type":"ContainerStarted","Data":"6cafd6d3b69a38793dfd0cf0423072201766e963edfdf3d04d3b07b00f0e9151"} Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.828414 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.831933 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d568d465d-qdlsh" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.935841 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 01 06:52:11 crc kubenswrapper[5127]: I0201 06:52:11.969181 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.444817 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.457672 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.495052 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.504684 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 01 
06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.510738 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.765190 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.773028 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.787697 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.821219 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.876275 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.926198 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.942274 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 01 06:52:12 crc kubenswrapper[5127]: I0201 06:52:12.942556 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 01 06:52:13 crc kubenswrapper[5127]: I0201 06:52:13.048902 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 06:52:13 crc kubenswrapper[5127]: I0201 06:52:13.079573 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 01 06:52:13 crc kubenswrapper[5127]: I0201 06:52:13.094005 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 01 06:52:13 crc kubenswrapper[5127]: I0201 06:52:13.334955 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 01 06:52:13 crc kubenswrapper[5127]: I0201 06:52:13.568234 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 01 06:52:14 crc kubenswrapper[5127]: I0201 06:52:14.091582 5127 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 06:52:14 crc kubenswrapper[5127]: I0201 06:52:14.375942 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.854497 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.854543 5127 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="664cc13bb4b1d526e2baa725796c4301d5411f72280e39d37bd99715f0b19d39" exitCode=137 Feb 01 06:52:15 crc 
kubenswrapper[5127]: I0201 06:52:15.854599 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a86d4f3c11e375e7774a0a44852f3057bf653dfe78dfb6b3defb3aec4762ce48" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.864523 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.864753 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977279 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977552 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977474 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977741 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977845 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977950 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977749 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.977849 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.978195 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.978338 5127 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.978409 5127 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.978472 5127 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:15 crc kubenswrapper[5127]: I0201 06:52:15.987428 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.079871 5127 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.080178 5127 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.244181 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.244699 5127 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.264870 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.264920 5127 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e325c40-e366-4d4a-abdb-6412f6db8876" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.271498 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 06:52:16.271545 5127 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="4e325c40-e366-4d4a-abdb-6412f6db8876" Feb 01 06:52:16 crc kubenswrapper[5127]: I0201 
06:52:16.859743 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 06:52:26 crc kubenswrapper[5127]: I0201 06:52:26.915838 5127 generic.go:334] "Generic (PLEG): container finished" podID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerID="8ceaa3d49815fb4a8ac9d12178e69e8f1dfb60126abfb336a8321bafe4fa3077" exitCode=0 Feb 01 06:52:26 crc kubenswrapper[5127]: I0201 06:52:26.915929 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerDied","Data":"8ceaa3d49815fb4a8ac9d12178e69e8f1dfb60126abfb336a8321bafe4fa3077"} Feb 01 06:52:26 crc kubenswrapper[5127]: I0201 06:52:26.916807 5127 scope.go:117] "RemoveContainer" containerID="8ceaa3d49815fb4a8ac9d12178e69e8f1dfb60126abfb336a8321bafe4fa3077" Feb 01 06:52:27 crc kubenswrapper[5127]: I0201 06:52:27.923699 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerStarted","Data":"2c8d6541f4bcb6694dcbafa64cbb81940ff1c504394f7fce8f77c18845cc339a"} Feb 01 06:52:27 crc kubenswrapper[5127]: I0201 06:52:27.924362 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:52:27 crc kubenswrapper[5127]: I0201 06:52:27.926722 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:52:30 crc kubenswrapper[5127]: I0201 06:52:30.008692 5127 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 01 06:52:45 crc kubenswrapper[5127]: I0201 06:52:45.025352 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 01 06:52:46 crc kubenswrapper[5127]: I0201 06:52:46.501920 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 06:53:36 crc kubenswrapper[5127]: I0201 06:53:36.741321 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:53:36 crc kubenswrapper[5127]: I0201 06:53:36.742082 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.736140 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67dj4"] Feb 01 06:53:39 crc kubenswrapper[5127]: E0201 06:53:39.736744 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.736760 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
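
The liveness failure just above (prober.go:107: Get "http://127.0.0.1:8798/health": connection refused) is an ordinary HTTP probe verdict against a port that briefly stopped listening. A hedged sketch of a container spec carrying a probe of that shape, built from the k8s.io/api types (the ProbeHandler field requires a reasonably recent API module): the host, path, and port are copied from the log line, while the image, period, and threshold are placeholders, not the machine-config-operator's actual manifest.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Host, path, and port match the failing probe in the log; the
	// numeric tuning values below are illustrative defaults only.
	probe := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Host: "127.0.0.1",
				Path: "/health",
				Port: intstr.FromInt(8798),
			},
		},
		PeriodSeconds:    10,
		FailureThreshold: 3,
	}

	container := corev1.Container{
		Name:          "machine-config-daemon",
		Image:         "example.invalid/mcd:latest", // placeholder image
		LivenessProbe: probe,
	}

	// A single refused connection is only logged, as in prober.go:107; the
	// kubelet restarts the container only after FailureThreshold
	// consecutive failures.
	fmt.Printf("probe http://%s:%s%s every %ds\n",
		container.LivenessProbe.HTTPGet.Host,
		container.LivenessProbe.HTTPGet.Port.String(),
		container.LivenessProbe.HTTPGet.Path,
		container.LivenessProbe.PeriodSeconds)
}

The same mechanics explain the earlier oauth-openshift readiness failure at 06:52:10: one failed HTTPS probe flipped readiness off, and the container's restart two lines later brought it back to "ready" without any operator intervention.
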
Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.736882 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.737283 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.757500 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67dj4"] Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.881702 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbcnd\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-kube-api-access-hbcnd\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.881743 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-tls\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.881763 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-trusted-ca\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.881956 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6019b1a4-e294-4f91-97da-75dcb50f6823-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.882051 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.882088 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-bound-sa-token\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.882121 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-certificates\") pod \"image-registry-66df7c8f76-67dj4\" (UID: 
\"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.882177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6019b1a4-e294-4f91-97da-75dcb50f6823-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.913222 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984404 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6019b1a4-e294-4f91-97da-75dcb50f6823-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984537 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-bound-sa-token\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984634 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-certificates\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984703 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6019b1a4-e294-4f91-97da-75dcb50f6823-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984785 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbcnd\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-kube-api-access-hbcnd\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.984819 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-tls\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: 
I0201 06:53:39.984858 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-trusted-ca\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.985186 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6019b1a4-e294-4f91-97da-75dcb50f6823-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.987051 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-certificates\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.987358 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6019b1a4-e294-4f91-97da-75dcb50f6823-trusted-ca\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.993813 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6019b1a4-e294-4f91-97da-75dcb50f6823-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:39 crc kubenswrapper[5127]: I0201 06:53:39.993951 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-registry-tls\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:40 crc kubenswrapper[5127]: I0201 06:53:40.012859 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-bound-sa-token\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:40 crc kubenswrapper[5127]: I0201 06:53:40.015876 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbcnd\" (UniqueName: \"kubernetes.io/projected/6019b1a4-e294-4f91-97da-75dcb50f6823-kube-api-access-hbcnd\") pod \"image-registry-66df7c8f76-67dj4\" (UID: \"6019b1a4-e294-4f91-97da-75dcb50f6823\") " pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:40 crc kubenswrapper[5127]: I0201 06:53:40.086513 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:40 crc kubenswrapper[5127]: I0201 06:53:40.379527 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67dj4"] Feb 01 06:53:40 crc kubenswrapper[5127]: I0201 06:53:40.407955 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" event={"ID":"6019b1a4-e294-4f91-97da-75dcb50f6823","Type":"ContainerStarted","Data":"4392a0c13227977b900a14e08cc9d6dfb973d7b7a8f354a13516960bb5f760e4"} Feb 01 06:53:41 crc kubenswrapper[5127]: I0201 06:53:41.416964 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" event={"ID":"6019b1a4-e294-4f91-97da-75dcb50f6823","Type":"ContainerStarted","Data":"cf63e871f9cf90dd23b530a75d373cf5b10c6666f67a8255efbd7ae60bf6cab7"} Feb 01 06:53:41 crc kubenswrapper[5127]: I0201 06:53:41.417433 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:53:41 crc kubenswrapper[5127]: I0201 06:53:41.445811 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" podStartSLOduration=2.445786594 podStartE2EDuration="2.445786594s" podCreationTimestamp="2026-02-01 06:53:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:53:41.442890362 +0000 UTC m=+371.928792765" watchObservedRunningTime="2026-02-01 06:53:41.445786594 +0000 UTC m=+371.931688987" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.093947 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.094692 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rdtzv" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="registry-server" containerID="cri-o://c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" gracePeriod=30 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.100092 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.100392 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6v4qm" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="registry-server" containerID="cri-o://259a92deb2ca7af706d6b2ebbaaa42c7c955da0b8d665f348aa9ac9a6186e445" gracePeriod=30 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.105442 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"] Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.105774 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" containerID="cri-o://2c8d6541f4bcb6694dcbafa64cbb81940ff1c504394f7fce8f77c18845cc339a" gracePeriod=30 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.130469 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 
01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.130811 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bxqsq" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="registry-server" containerID="cri-o://caffc3b590bfb73029801dc2def95a720ca706e5dc8253486ac705faff6fe8da" gracePeriod=30 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.140412 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"] Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.140797 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jjscz" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="registry-server" containerID="cri-o://f966fd9dbf355d0000f9d2bf533f58d6715ff5f09f44884903e448bb56030f91" gracePeriod=30 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.146024 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9bpzr"] Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.146710 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.153781 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9bpzr"] Feb 01 06:53:59 crc kubenswrapper[5127]: E0201 06:53:59.178085 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800 is running failed: container process not found" containerID="c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:53:59 crc kubenswrapper[5127]: E0201 06:53:59.178506 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800 is running failed: container process not found" containerID="c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:53:59 crc kubenswrapper[5127]: E0201 06:53:59.178757 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800 is running failed: container process not found" containerID="c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 06:53:59 crc kubenswrapper[5127]: E0201 06:53:59.178811 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-rdtzv" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="registry-server" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.283422 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dgl\" (UniqueName: 
\"kubernetes.io/projected/68e6e2cf-dc33-488e-8308-928b146d9aa3-kube-api-access-47dgl\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.283508 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.283560 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.384741 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47dgl\" (UniqueName: \"kubernetes.io/projected/68e6e2cf-dc33-488e-8308-928b146d9aa3-kube-api-access-47dgl\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.385203 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.385373 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.390538 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.397701 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68e6e2cf-dc33-488e-8308-928b146d9aa3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.400514 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dgl\" (UniqueName: 
\"kubernetes.io/projected/68e6e2cf-dc33-488e-8308-928b146d9aa3-kube-api-access-47dgl\") pod \"marketplace-operator-79b997595-9bpzr\" (UID: \"68e6e2cf-dc33-488e-8308-928b146d9aa3\") " pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.466557 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.552935 5127 generic.go:334] "Generic (PLEG): container finished" podID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerID="caffc3b590bfb73029801dc2def95a720ca706e5dc8253486ac705faff6fe8da" exitCode=0 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.553030 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerDied","Data":"caffc3b590bfb73029801dc2def95a720ca706e5dc8253486ac705faff6fe8da"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.553062 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bxqsq" event={"ID":"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30","Type":"ContainerDied","Data":"f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.553075 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8500056b2318a657bf1ad2f77579ec0f8b8032b4e8638adf474218bd90769c2" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.556404 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerDied","Data":"c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.556398 5127 generic.go:334] "Generic (PLEG): container finished" podID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerID="c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" exitCode=0 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.556794 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdtzv" event={"ID":"3b3cb296-e043-482b-b6e0-50f0341eee73","Type":"ContainerDied","Data":"9a4196927b5a9003ba85bd2d46ddcae376fdbacbddc72472b131b734c270a43e"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.556844 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4196927b5a9003ba85bd2d46ddcae376fdbacbddc72472b131b734c270a43e" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.561612 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerDied","Data":"2c8d6541f4bcb6694dcbafa64cbb81940ff1c504394f7fce8f77c18845cc339a"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.561646 5127 scope.go:117] "RemoveContainer" containerID="8ceaa3d49815fb4a8ac9d12178e69e8f1dfb60126abfb336a8321bafe4fa3077" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.561569 5127 generic.go:334] "Generic (PLEG): container finished" podID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerID="2c8d6541f4bcb6694dcbafa64cbb81940ff1c504394f7fce8f77c18845cc339a" exitCode=0 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.561732 5127 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" event={"ID":"84d1d573-370c-47e5-aab1-aee630e9aef0","Type":"ContainerDied","Data":"79438ca8d045e5413d268500e0cbbf6a39a98413ea29ec379579c4da130a211f"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.561765 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79438ca8d045e5413d268500e0cbbf6a39a98413ea29ec379579c4da130a211f" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.581200 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.582487 5127 generic.go:334] "Generic (PLEG): container finished" podID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerID="259a92deb2ca7af706d6b2ebbaaa42c7c955da0b8d665f348aa9ac9a6186e445" exitCode=0 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.582574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerDied","Data":"259a92deb2ca7af706d6b2ebbaaa42c7c955da0b8d665f348aa9ac9a6186e445"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.582637 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6v4qm" event={"ID":"ac3c627d-681a-4008-9bda-5e5f3af5aafd","Type":"ContainerDied","Data":"dc4805787ae40f37b2f33fc5cde7415976f0606675e6be59e2bd9453d120119f"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.582650 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc4805787ae40f37b2f33fc5cde7415976f0606675e6be59e2bd9453d120119f" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.586847 5127 generic.go:334] "Generic (PLEG): container finished" podID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerID="f966fd9dbf355d0000f9d2bf533f58d6715ff5f09f44884903e448bb56030f91" exitCode=0 Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.586922 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerDied","Data":"f966fd9dbf355d0000f9d2bf533f58d6715ff5f09f44884903e448bb56030f91"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.586962 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjscz" event={"ID":"02d7fc8d-87e8-455b-9f99-fde65167beea","Type":"ContainerDied","Data":"d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0"} Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.586977 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7491f340ba64907c775029744610676e217121c6c795de4be2edd3d7000b5e0" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.588906 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.598009 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.607936 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.615252 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689777 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca\") pod \"84d1d573-370c-47e5-aab1-aee630e9aef0\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689833 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnj4r\" (UniqueName: \"kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r\") pod \"3b3cb296-e043-482b-b6e0-50f0341eee73\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689869 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content\") pod \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities\") pod \"3b3cb296-e043-482b-b6e0-50f0341eee73\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689953 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bk6x\" (UniqueName: \"kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x\") pod \"02d7fc8d-87e8-455b-9f99-fde65167beea\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.689981 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvf7n\" (UniqueName: \"kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n\") pod \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690021 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content\") pod \"3b3cb296-e043-482b-b6e0-50f0341eee73\" (UID: \"3b3cb296-e043-482b-b6e0-50f0341eee73\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690101 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content\") pod \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690127 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities\") pod \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\" (UID: \"ac3c627d-681a-4008-9bda-5e5f3af5aafd\") " Feb 01 06:53:59 crc 
kubenswrapper[5127]: I0201 06:53:59.690148 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities\") pod \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690180 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content\") pod \"02d7fc8d-87e8-455b-9f99-fde65167beea\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690206 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics\") pod \"84d1d573-370c-47e5-aab1-aee630e9aef0\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690229 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities\") pod \"02d7fc8d-87e8-455b-9f99-fde65167beea\" (UID: \"02d7fc8d-87e8-455b-9f99-fde65167beea\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690261 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfmxg\" (UniqueName: \"kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg\") pod \"84d1d573-370c-47e5-aab1-aee630e9aef0\" (UID: \"84d1d573-370c-47e5-aab1-aee630e9aef0\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.690289 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pth5\" (UniqueName: \"kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5\") pod \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\" (UID: \"8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30\") " Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.694743 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities" (OuterVolumeSpecName: "utilities") pod "ac3c627d-681a-4008-9bda-5e5f3af5aafd" (UID: "ac3c627d-681a-4008-9bda-5e5f3af5aafd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.695564 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities" (OuterVolumeSpecName: "utilities") pod "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" (UID: "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.696323 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x" (OuterVolumeSpecName: "kube-api-access-2bk6x") pod "02d7fc8d-87e8-455b-9f99-fde65167beea" (UID: "02d7fc8d-87e8-455b-9f99-fde65167beea"). InnerVolumeSpecName "kube-api-access-2bk6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.696419 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities" (OuterVolumeSpecName: "utilities") pod "02d7fc8d-87e8-455b-9f99-fde65167beea" (UID: "02d7fc8d-87e8-455b-9f99-fde65167beea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.697012 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n" (OuterVolumeSpecName: "kube-api-access-rvf7n") pod "ac3c627d-681a-4008-9bda-5e5f3af5aafd" (UID: "ac3c627d-681a-4008-9bda-5e5f3af5aafd"). InnerVolumeSpecName "kube-api-access-rvf7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.697140 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r" (OuterVolumeSpecName: "kube-api-access-gnj4r") pod "3b3cb296-e043-482b-b6e0-50f0341eee73" (UID: "3b3cb296-e043-482b-b6e0-50f0341eee73"). InnerVolumeSpecName "kube-api-access-gnj4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.697503 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities" (OuterVolumeSpecName: "utilities") pod "3b3cb296-e043-482b-b6e0-50f0341eee73" (UID: "3b3cb296-e043-482b-b6e0-50f0341eee73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.698629 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "84d1d573-370c-47e5-aab1-aee630e9aef0" (UID: "84d1d573-370c-47e5-aab1-aee630e9aef0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.702330 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "84d1d573-370c-47e5-aab1-aee630e9aef0" (UID: "84d1d573-370c-47e5-aab1-aee630e9aef0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.710031 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5" (OuterVolumeSpecName: "kube-api-access-2pth5") pod "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" (UID: "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30"). InnerVolumeSpecName "kube-api-access-2pth5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.717262 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg" (OuterVolumeSpecName: "kube-api-access-pfmxg") pod "84d1d573-370c-47e5-aab1-aee630e9aef0" (UID: "84d1d573-370c-47e5-aab1-aee630e9aef0"). InnerVolumeSpecName "kube-api-access-pfmxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.725677 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" (UID: "8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.754501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b3cb296-e043-482b-b6e0-50f0341eee73" (UID: "3b3cb296-e043-482b-b6e0-50f0341eee73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.768424 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac3c627d-681a-4008-9bda-5e5f3af5aafd" (UID: "ac3c627d-681a-4008-9bda-5e5f3af5aafd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791671 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pth5\" (UniqueName: \"kubernetes.io/projected/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-kube-api-access-2pth5\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791706 5127 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791719 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnj4r\" (UniqueName: \"kubernetes.io/projected/3b3cb296-e043-482b-b6e0-50f0341eee73-kube-api-access-gnj4r\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791732 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791745 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791758 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bk6x\" (UniqueName: \"kubernetes.io/projected/02d7fc8d-87e8-455b-9f99-fde65167beea-kube-api-access-2bk6x\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791770 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvf7n\" (UniqueName: \"kubernetes.io/projected/ac3c627d-681a-4008-9bda-5e5f3af5aafd-kube-api-access-rvf7n\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791802 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b3cb296-e043-482b-b6e0-50f0341eee73-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791813 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791823 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c627d-681a-4008-9bda-5e5f3af5aafd-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791836 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791847 5127 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84d1d573-370c-47e5-aab1-aee630e9aef0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791859 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.791869 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfmxg\" (UniqueName: \"kubernetes.io/projected/84d1d573-370c-47e5-aab1-aee630e9aef0-kube-api-access-pfmxg\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.832009 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02d7fc8d-87e8-455b-9f99-fde65167beea" (UID: "02d7fc8d-87e8-455b-9f99-fde65167beea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.898021 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02d7fc8d-87e8-455b-9f99-fde65167beea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 06:53:59 crc kubenswrapper[5127]: I0201 06:53:59.928671 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9bpzr"] Feb 01 06:53:59 crc kubenswrapper[5127]: W0201 06:53:59.943401 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68e6e2cf_dc33_488e_8308_928b146d9aa3.slice/crio-9a8636883ae85aab4f2f6a7cbb67f98fbef75a7b8a44ae0618065307c08c1d88 WatchSource:0}: Error finding container 9a8636883ae85aab4f2f6a7cbb67f98fbef75a7b8a44ae0618065307c08c1d88: Status 404 returned error can't find the container with id 9a8636883ae85aab4f2f6a7cbb67f98fbef75a7b8a44ae0618065307c08c1d88 Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.091528 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-67dj4" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.174594 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.593833 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zc4gp" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.595956 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" event={"ID":"68e6e2cf-dc33-488e-8308-928b146d9aa3","Type":"ContainerStarted","Data":"8f8b823f2945166ac48c252935b891311f59b5908063779a67712a85118bd787"} Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596005 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" event={"ID":"68e6e2cf-dc33-488e-8308-928b146d9aa3","Type":"ContainerStarted","Data":"9a8636883ae85aab4f2f6a7cbb67f98fbef75a7b8a44ae0618065307c08c1d88"} Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596068 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bxqsq" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596419 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdtzv" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596532 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjscz" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596622 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6v4qm" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.596952 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.602872 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.630411 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9bpzr" podStartSLOduration=1.630377153 podStartE2EDuration="1.630377153s" podCreationTimestamp="2026-02-01 06:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 06:54:00.625491736 +0000 UTC m=+391.111394099" watchObservedRunningTime="2026-02-01 06:54:00.630377153 +0000 UTC m=+391.116279516" Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.645689 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.650923 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bxqsq"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.685674 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.705312 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6v4qm"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.708539 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.714290 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rdtzv"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.718451 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.721309 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zc4gp"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.724430 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"] Feb 01 06:54:00 crc kubenswrapper[5127]: I0201 06:54:00.727004 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jjscz"] Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.319920 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xqbdb"] Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320572 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" 
containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320646 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320692 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320710 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320737 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320753 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320771 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320787 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320813 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320830 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320851 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320868 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320889 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320905 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320930 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320947 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.320971 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.320987 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.321012 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321027 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="extract-content" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.321047 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321064 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.321092 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321109 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.321136 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321153 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="extract-utilities" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321372 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321402 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321431 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321453 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321473 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321506 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" containerName="registry-server" Feb 01 06:54:01 crc kubenswrapper[5127]: E0201 06:54:01.321760 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.321783 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" containerName="marketplace-operator" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.326642 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.328551 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.329827 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqbdb"] Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.415155 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-catalog-content\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.415201 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf44x\" (UniqueName: \"kubernetes.io/projected/38271625-b7ec-4011-b426-b4ec1a5bb669-kube-api-access-tf44x\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.415247 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-utilities\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.510372 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2tgc4"] Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.512813 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.515744 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.516702 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-utilities\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.516802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-catalog-content\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.516836 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf44x\" (UniqueName: \"kubernetes.io/projected/38271625-b7ec-4011-b426-b4ec1a5bb669-kube-api-access-tf44x\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.517417 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-utilities\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.517445 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38271625-b7ec-4011-b426-b4ec1a5bb669-catalog-content\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.524911 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tgc4"] Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.557364 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf44x\" (UniqueName: \"kubernetes.io/projected/38271625-b7ec-4011-b426-b4ec1a5bb669-kube-api-access-tf44x\") pod \"redhat-marketplace-xqbdb\" (UID: \"38271625-b7ec-4011-b426-b4ec1a5bb669\") " pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.618434 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjzl\" (UniqueName: \"kubernetes.io/projected/2a50674b-ac62-4f1e-9be7-fe427860937e-kube-api-access-cdjzl\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.618505 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-utilities\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " 
pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.618566 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-catalog-content\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.644707 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.720090 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjzl\" (UniqueName: \"kubernetes.io/projected/2a50674b-ac62-4f1e-9be7-fe427860937e-kube-api-access-cdjzl\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.720158 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-utilities\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.720184 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-catalog-content\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.721068 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-utilities\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.721370 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a50674b-ac62-4f1e-9be7-fe427860937e-catalog-content\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.740399 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjzl\" (UniqueName: \"kubernetes.io/projected/2a50674b-ac62-4f1e-9be7-fe427860937e-kube-api-access-cdjzl\") pod \"redhat-operators-2tgc4\" (UID: \"2a50674b-ac62-4f1e-9be7-fe427860937e\") " pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.838857 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:01 crc kubenswrapper[5127]: I0201 06:54:01.840242 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqbdb"] Feb 01 06:54:01 crc kubenswrapper[5127]: W0201 06:54:01.847052 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38271625_b7ec_4011_b426_b4ec1a5bb669.slice/crio-22665bd75f82472f65c8d517cb4aa5e8121d19482b1f2c5011b1b5b3ea1ff436 WatchSource:0}: Error finding container 22665bd75f82472f65c8d517cb4aa5e8121d19482b1f2c5011b1b5b3ea1ff436: Status 404 returned error can't find the container with id 22665bd75f82472f65c8d517cb4aa5e8121d19482b1f2c5011b1b5b3ea1ff436 Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.245267 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d7fc8d-87e8-455b-9f99-fde65167beea" path="/var/lib/kubelet/pods/02d7fc8d-87e8-455b-9f99-fde65167beea/volumes" Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.247135 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b3cb296-e043-482b-b6e0-50f0341eee73" path="/var/lib/kubelet/pods/3b3cb296-e043-482b-b6e0-50f0341eee73/volumes" Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.248686 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d1d573-370c-47e5-aab1-aee630e9aef0" path="/var/lib/kubelet/pods/84d1d573-370c-47e5-aab1-aee630e9aef0/volumes" Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.250572 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30" path="/var/lib/kubelet/pods/8ea38b4f-0889-4c3c-bdfc-7afa8a41eb30/volumes" Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.251893 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3c627d-681a-4008-9bda-5e5f3af5aafd" path="/var/lib/kubelet/pods/ac3c627d-681a-4008-9bda-5e5f3af5aafd/volumes" Feb 01 06:54:02 crc kubenswrapper[5127]: W0201 06:54:02.268739 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a50674b_ac62_4f1e_9be7_fe427860937e.slice/crio-9dbcd841492ea512d32ba4c72adae87b386c769870a3a394c45a00f1983c509f WatchSource:0}: Error finding container 9dbcd841492ea512d32ba4c72adae87b386c769870a3a394c45a00f1983c509f: Status 404 returned error can't find the container with id 9dbcd841492ea512d32ba4c72adae87b386c769870a3a394c45a00f1983c509f Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.271045 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2tgc4"] Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.608976 5127 generic.go:334] "Generic (PLEG): container finished" podID="38271625-b7ec-4011-b426-b4ec1a5bb669" containerID="9522872bc04dde85782acb7904d9494d4ca8f90a1e710bba3f5d4bf11d72f743" exitCode=0 Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.609081 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqbdb" event={"ID":"38271625-b7ec-4011-b426-b4ec1a5bb669","Type":"ContainerDied","Data":"9522872bc04dde85782acb7904d9494d4ca8f90a1e710bba3f5d4bf11d72f743"} Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.609122 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqbdb" 
event={"ID":"38271625-b7ec-4011-b426-b4ec1a5bb669","Type":"ContainerStarted","Data":"22665bd75f82472f65c8d517cb4aa5e8121d19482b1f2c5011b1b5b3ea1ff436"} Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.615052 5127 generic.go:334] "Generic (PLEG): container finished" podID="2a50674b-ac62-4f1e-9be7-fe427860937e" containerID="f2b51b552ced220a268df4168e790ba282ec1816bffb1c7cb73ddb475b873798" exitCode=0 Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.616248 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tgc4" event={"ID":"2a50674b-ac62-4f1e-9be7-fe427860937e","Type":"ContainerDied","Data":"f2b51b552ced220a268df4168e790ba282ec1816bffb1c7cb73ddb475b873798"} Feb 01 06:54:02 crc kubenswrapper[5127]: I0201 06:54:02.616299 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tgc4" event={"ID":"2a50674b-ac62-4f1e-9be7-fe427860937e","Type":"ContainerStarted","Data":"9dbcd841492ea512d32ba4c72adae87b386c769870a3a394c45a00f1983c509f"} Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.622690 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqbdb" event={"ID":"38271625-b7ec-4011-b426-b4ec1a5bb669","Type":"ContainerStarted","Data":"d5b8fe224b95d1674b009b73fbef3be7392b3187cdbe08244dd4331f50606b3b"} Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.625346 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tgc4" event={"ID":"2a50674b-ac62-4f1e-9be7-fe427860937e","Type":"ContainerStarted","Data":"c66ce5d56fe4a143470112979acbab3f0dc5aacc66e594ee1c3682bddc479e5d"} Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.707752 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"] Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.709071 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.713206 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.720107 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"] Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.852143 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgm6s\" (UniqueName: \"kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.852253 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.852420 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.907087 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.908017 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.911144 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.928983 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.954261 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgm6s\" (UniqueName: \"kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.954316 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.954375 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.954821 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.954886 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:03 crc kubenswrapper[5127]: I0201 06:54:03.973607 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgm6s\" (UniqueName: \"kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s\") pod \"certified-operators-qlzcf\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") " pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.039089 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.055506 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxt6r\" (UniqueName: \"kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.055558 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.055639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.157537 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.157941 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxt6r\" (UniqueName: \"kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.157975 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.159018 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.159246 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities\") pod \"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.176957 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxt6r\" (UniqueName: \"kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r\") pod 
\"community-operators-5jszf\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.229860 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.259633 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"] Feb 01 06:54:04 crc kubenswrapper[5127]: W0201 06:54:04.270344 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod596ead02_22f5_4b2a_9c63_41b24465a402.slice/crio-6c852d73580176959d15682d0b1baf04d2357772b6e6f6db728030eb8d84e0fd WatchSource:0}: Error finding container 6c852d73580176959d15682d0b1baf04d2357772b6e6f6db728030eb8d84e0fd: Status 404 returned error can't find the container with id 6c852d73580176959d15682d0b1baf04d2357772b6e6f6db728030eb8d84e0fd Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.436133 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 06:54:04 crc kubenswrapper[5127]: W0201 06:54:04.445682 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d430573_203e_43ff_abc3_a9e81827c1d6.slice/crio-bf938a70a30ae5a1c8dc9a49eddaf047eaa7a83843a3f01e33fe42139e43cbb1 WatchSource:0}: Error finding container bf938a70a30ae5a1c8dc9a49eddaf047eaa7a83843a3f01e33fe42139e43cbb1: Status 404 returned error can't find the container with id bf938a70a30ae5a1c8dc9a49eddaf047eaa7a83843a3f01e33fe42139e43cbb1 Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.633635 5127 generic.go:334] "Generic (PLEG): container finished" podID="596ead02-22f5-4b2a-9c63-41b24465a402" containerID="7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec" exitCode=0 Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.633719 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerDied","Data":"7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec"} Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.633813 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerStarted","Data":"6c852d73580176959d15682d0b1baf04d2357772b6e6f6db728030eb8d84e0fd"} Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.636454 5127 generic.go:334] "Generic (PLEG): container finished" podID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerID="cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce" exitCode=0 Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.636528 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerDied","Data":"cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce"} Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.636551 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerStarted","Data":"bf938a70a30ae5a1c8dc9a49eddaf047eaa7a83843a3f01e33fe42139e43cbb1"} 
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.640727 5127 generic.go:334] "Generic (PLEG): container finished" podID="38271625-b7ec-4011-b426-b4ec1a5bb669" containerID="d5b8fe224b95d1674b009b73fbef3be7392b3187cdbe08244dd4331f50606b3b" exitCode=0
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.640803 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqbdb" event={"ID":"38271625-b7ec-4011-b426-b4ec1a5bb669","Type":"ContainerDied","Data":"d5b8fe224b95d1674b009b73fbef3be7392b3187cdbe08244dd4331f50606b3b"}
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.640838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqbdb" event={"ID":"38271625-b7ec-4011-b426-b4ec1a5bb669","Type":"ContainerStarted","Data":"7fa20837fabf16ba23388a537c98ee780e727945b37d4cc266c20c6f126b74d3"}
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.642744 5127 generic.go:334] "Generic (PLEG): container finished" podID="2a50674b-ac62-4f1e-9be7-fe427860937e" containerID="c66ce5d56fe4a143470112979acbab3f0dc5aacc66e594ee1c3682bddc479e5d" exitCode=0
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.642779 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tgc4" event={"ID":"2a50674b-ac62-4f1e-9be7-fe427860937e","Type":"ContainerDied","Data":"c66ce5d56fe4a143470112979acbab3f0dc5aacc66e594ee1c3682bddc479e5d"}
Feb 01 06:54:04 crc kubenswrapper[5127]: I0201 06:54:04.703818 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xqbdb" podStartSLOduration=2.28075621 podStartE2EDuration="3.703801256s" podCreationTimestamp="2026-02-01 06:54:01 +0000 UTC" firstStartedPulling="2026-02-01 06:54:02.611144395 +0000 UTC m=+393.097046798" lastFinishedPulling="2026-02-01 06:54:04.034189481 +0000 UTC m=+394.520091844" observedRunningTime="2026-02-01 06:54:04.701761489 +0000 UTC m=+395.187663872" watchObservedRunningTime="2026-02-01 06:54:04.703801256 +0000 UTC m=+395.189703609"
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.655886 5127 generic.go:334] "Generic (PLEG): container finished" podID="596ead02-22f5-4b2a-9c63-41b24465a402" containerID="d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2" exitCode=0
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.655947 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerDied","Data":"d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2"}
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.661091 5127 generic.go:334] "Generic (PLEG): container finished" podID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerID="d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954" exitCode=0
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.661190 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerDied","Data":"d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954"}
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.663979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2tgc4" event={"ID":"2a50674b-ac62-4f1e-9be7-fe427860937e","Type":"ContainerStarted","Data":"6a3af76a1a4974c21082e2c28c062a9517a42ab9d4abc34ebff37d8e4d1157b2"}
Feb 01 06:54:05 crc kubenswrapper[5127]: I0201 06:54:05.707444 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2tgc4" podStartSLOduration=2.302576987 podStartE2EDuration="4.707428121s" podCreationTimestamp="2026-02-01 06:54:01 +0000 UTC" firstStartedPulling="2026-02-01 06:54:02.617487215 +0000 UTC m=+393.103389578" lastFinishedPulling="2026-02-01 06:54:05.022338349 +0000 UTC m=+395.508240712" observedRunningTime="2026-02-01 06:54:05.703644584 +0000 UTC m=+396.189546947" watchObservedRunningTime="2026-02-01 06:54:05.707428121 +0000 UTC m=+396.193330484"
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.671929 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerStarted","Data":"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815"}
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.673815 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerStarted","Data":"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774"}
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.693427 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qlzcf" podStartSLOduration=2.162511017 podStartE2EDuration="3.693411588s" podCreationTimestamp="2026-02-01 06:54:03 +0000 UTC" firstStartedPulling="2026-02-01 06:54:04.634932482 +0000 UTC m=+395.120834845" lastFinishedPulling="2026-02-01 06:54:06.165833053 +0000 UTC m=+396.651735416" observedRunningTime="2026-02-01 06:54:06.691051031 +0000 UTC m=+397.176953414" watchObservedRunningTime="2026-02-01 06:54:06.693411588 +0000 UTC m=+397.179313951"
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.707520 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5jszf" podStartSLOduration=2.291058666 podStartE2EDuration="3.707493166s" podCreationTimestamp="2026-02-01 06:54:03 +0000 UTC" firstStartedPulling="2026-02-01 06:54:04.641029344 +0000 UTC m=+395.126931707" lastFinishedPulling="2026-02-01 06:54:06.057463844 +0000 UTC m=+396.543366207" observedRunningTime="2026-02-01 06:54:06.705575011 +0000 UTC m=+397.191477384" watchObservedRunningTime="2026-02-01 06:54:06.707493166 +0000 UTC m=+397.193395539"
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.741117 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 06:54:06 crc kubenswrapper[5127]: I0201 06:54:06.741199 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.645021 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xqbdb"
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.645689 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.710136 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.749575 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xqbdb" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.839888 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.839927 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:11 crc kubenswrapper[5127]: I0201 06:54:11.881500 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:12 crc kubenswrapper[5127]: I0201 06:54:12.752367 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2tgc4" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.039958 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.040306 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.081742 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.230195 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.230725 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.277057 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.762290 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qlzcf" Feb 01 06:54:14 crc kubenswrapper[5127]: I0201 06:54:14.766034 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5jszf" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.214248 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" podUID="f90c88e9-7849-4ec4-9df3-311426864686" containerName="registry" containerID="cri-o://d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df" gracePeriod=30 Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.512014 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641431 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641484 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641521 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szrk4\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641552 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641614 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641713 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.641744 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted\") pod \"f90c88e9-7849-4ec4-9df3-311426864686\" (UID: \"f90c88e9-7849-4ec4-9df3-311426864686\") " Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.642301 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.644105 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.646901 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.647357 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.647824 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.647960 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4" (OuterVolumeSpecName: "kube-api-access-szrk4") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "kube-api-access-szrk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.654337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.658899 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f90c88e9-7849-4ec4-9df3-311426864686" (UID: "f90c88e9-7849-4ec4-9df3-311426864686"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743348 5127 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743646 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szrk4\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-kube-api-access-szrk4\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743656 5127 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743665 5127 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f90c88e9-7849-4ec4-9df3-311426864686-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743675 5127 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f90c88e9-7849-4ec4-9df3-311426864686-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743684 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f90c88e9-7849-4ec4-9df3-311426864686-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.743692 5127 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f90c88e9-7849-4ec4-9df3-311426864686-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.782384 5127 generic.go:334] "Generic (PLEG): container finished" podID="f90c88e9-7849-4ec4-9df3-311426864686" containerID="d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df" exitCode=0 Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.782424 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" event={"ID":"f90c88e9-7849-4ec4-9df3-311426864686","Type":"ContainerDied","Data":"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df"} Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.782442 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.782457 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-m8jnf" event={"ID":"f90c88e9-7849-4ec4-9df3-311426864686","Type":"ContainerDied","Data":"c79cc141021a8ce9da602a954994824863a0ff0481ada0d05ac8ef255ab01afd"} Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.782478 5127 scope.go:117] "RemoveContainer" containerID="d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.801494 5127 scope.go:117] "RemoveContainer" containerID="d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df" Feb 01 06:54:25 crc kubenswrapper[5127]: E0201 06:54:25.801980 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df\": container with ID starting with d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df not found: ID does not exist" containerID="d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.802039 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df"} err="failed to get container status \"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df\": rpc error: code = NotFound desc = could not find container \"d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df\": container with ID starting with d6f4b515d157cf0f576c7c2718c951c2cd46ec059b5dd501d5d97436c0a6b7df not found: ID does not exist" Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.815348 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"] Feb 01 06:54:25 crc kubenswrapper[5127]: I0201 06:54:25.820569 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-m8jnf"] Feb 01 06:54:26 crc kubenswrapper[5127]: I0201 06:54:26.241502 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f90c88e9-7849-4ec4-9df3-311426864686" path="/var/lib/kubelet/pods/f90c88e9-7849-4ec4-9df3-311426864686/volumes" Feb 01 06:54:36 crc kubenswrapper[5127]: I0201 06:54:36.741058 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:54:36 crc kubenswrapper[5127]: I0201 06:54:36.741763 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:54:36 crc kubenswrapper[5127]: I0201 06:54:36.741814 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:54:36 crc kubenswrapper[5127]: I0201 06:54:36.742428 5127 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:54:36 crc kubenswrapper[5127]: I0201 06:54:36.742524 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca" gracePeriod=600 Feb 01 06:54:37 crc kubenswrapper[5127]: I0201 06:54:37.855529 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca" exitCode=0 Feb 01 06:54:37 crc kubenswrapper[5127]: I0201 06:54:37.855637 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca"} Feb 01 06:54:37 crc kubenswrapper[5127]: I0201 06:54:37.855945 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1"} Feb 01 06:54:37 crc kubenswrapper[5127]: I0201 06:54:37.855972 5127 scope.go:117] "RemoveContainer" containerID="6432a92e029266de680aa5dca4c66b3c873d7a410678649f0bcd406ae4873786" Feb 01 06:56:30 crc kubenswrapper[5127]: I0201 06:56:30.504837 5127 scope.go:117] "RemoveContainer" containerID="3ec61d217380b97a3274e891b06e3d12fe035aef8ba92fd3ef037a365acf64e0" Feb 01 06:56:30 crc kubenswrapper[5127]: I0201 06:56:30.543929 5127 scope.go:117] "RemoveContainer" containerID="59bfed2c64776ad46b29c69064635dcd61034fc1749587144db6149b6430b3d6" Feb 01 06:56:30 crc kubenswrapper[5127]: I0201 06:56:30.569732 5127 scope.go:117] "RemoveContainer" containerID="e5958004707a41d32faab7b32042485f82253776f5fc7d7ca5ec44cc07e41be8" Feb 01 06:56:30 crc kubenswrapper[5127]: I0201 06:56:30.591554 5127 scope.go:117] "RemoveContainer" containerID="d574a92e6ca46c70f085afe245abffec4e912391f5e64bb7f571ec48168fe2fd" Feb 01 06:57:06 crc kubenswrapper[5127]: I0201 06:57:06.741163 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:57:06 crc kubenswrapper[5127]: I0201 06:57:06.742735 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.625756 5127 scope.go:117] "RemoveContainer" containerID="664cc13bb4b1d526e2baa725796c4301d5411f72280e39d37bd99715f0b19d39" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.649908 5127 scope.go:117] "RemoveContainer" 
containerID="66da19ad99956146c19fd6a2610496c65cc643853d3c4eebade73e0a4286c772" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.693561 5127 scope.go:117] "RemoveContainer" containerID="fa6b9d11a0e4db31140967403504baeb16c01b4c9f05807327e292dfbee4ff19" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.715496 5127 scope.go:117] "RemoveContainer" containerID="50a1cf66fceb6cbde7e81d09d160e3f57e1b3f770e2c9f17e44d3c2e44530812" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.740306 5127 scope.go:117] "RemoveContainer" containerID="f966fd9dbf355d0000f9d2bf533f58d6715ff5f09f44884903e448bb56030f91" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.758766 5127 scope.go:117] "RemoveContainer" containerID="caffc3b590bfb73029801dc2def95a720ca706e5dc8253486ac705faff6fe8da" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.776120 5127 scope.go:117] "RemoveContainer" containerID="259a92deb2ca7af706d6b2ebbaaa42c7c955da0b8d665f348aa9ac9a6186e445" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.789023 5127 scope.go:117] "RemoveContainer" containerID="b39329e91e0e982d2c770b0ed534d4c48a0f35309010a9e106e5205fb9d3881a" Feb 01 06:57:30 crc kubenswrapper[5127]: I0201 06:57:30.802916 5127 scope.go:117] "RemoveContainer" containerID="c01ca8566c59ffb9431a8c5475c256ede92fab9406332da44080410515079800" Feb 01 06:57:36 crc kubenswrapper[5127]: I0201 06:57:36.740920 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:57:36 crc kubenswrapper[5127]: I0201 06:57:36.741276 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:58:06 crc kubenswrapper[5127]: I0201 06:58:06.741146 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 06:58:06 crc kubenswrapper[5127]: I0201 06:58:06.741769 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 06:58:06 crc kubenswrapper[5127]: I0201 06:58:06.741834 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 06:58:06 crc kubenswrapper[5127]: I0201 06:58:06.742685 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 06:58:06 crc kubenswrapper[5127]: I0201 06:58:06.742810 5127 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1" gracePeriod=600 Feb 01 06:58:07 crc kubenswrapper[5127]: I0201 06:58:07.337084 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1" exitCode=0 Feb 01 06:58:07 crc kubenswrapper[5127]: I0201 06:58:07.337371 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1"} Feb 01 06:58:07 crc kubenswrapper[5127]: I0201 06:58:07.337419 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7"} Feb 01 06:58:07 crc kubenswrapper[5127]: I0201 06:58:07.337438 5127 scope.go:117] "RemoveContainer" containerID="44b5c7dc6fc8fd58e9f7997c01ccc01c8419c0923e9ed17a53aef39ef44af5ca" Feb 01 06:58:30 crc kubenswrapper[5127]: I0201 06:58:30.872244 5127 scope.go:117] "RemoveContainer" containerID="2c8d6541f4bcb6694dcbafa64cbb81940ff1c504394f7fce8f77c18845cc339a" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.181255 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662"] Feb 01 07:00:00 crc kubenswrapper[5127]: E0201 07:00:00.182349 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90c88e9-7849-4ec4-9df3-311426864686" containerName="registry" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.182381 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90c88e9-7849-4ec4-9df3-311426864686" containerName="registry" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.182688 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f90c88e9-7849-4ec4-9df3-311426864686" containerName="registry" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.183495 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.185886 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.186622 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.201239 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662"] Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.290391 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.290525 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfm5z\" (UniqueName: \"kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.290572 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.392414 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.392489 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfm5z\" (UniqueName: \"kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.392522 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.393429 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume\") pod 
\"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.399536 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.412857 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfm5z\" (UniqueName: \"kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z\") pod \"collect-profiles-29498820-2b662\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.510956 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:00 crc kubenswrapper[5127]: I0201 07:00:00.751469 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662"] Feb 01 07:00:01 crc kubenswrapper[5127]: I0201 07:00:01.152598 5127 generic.go:334] "Generic (PLEG): container finished" podID="a20b3259-17e4-4994-85fb-efd8f4cb4aa5" containerID="db80a2538217fba326d03439d56791ae70b8dc5fe90d51f0f4aec2b075f86734" exitCode=0 Feb 01 07:00:01 crc kubenswrapper[5127]: I0201 07:00:01.152733 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" event={"ID":"a20b3259-17e4-4994-85fb-efd8f4cb4aa5","Type":"ContainerDied","Data":"db80a2538217fba326d03439d56791ae70b8dc5fe90d51f0f4aec2b075f86734"} Feb 01 07:00:01 crc kubenswrapper[5127]: I0201 07:00:01.152935 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" event={"ID":"a20b3259-17e4-4994-85fb-efd8f4cb4aa5","Type":"ContainerStarted","Data":"8e84e65bbf92bcb87a774d2e1066a2d81a923ab4668d6b3aaa42fa4712ad3e4a"} Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.432437 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.519522 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfm5z\" (UniqueName: \"kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z\") pod \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.519592 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume\") pod \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.519646 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume\") pod \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\" (UID: \"a20b3259-17e4-4994-85fb-efd8f4cb4aa5\") " Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.520870 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume" (OuterVolumeSpecName: "config-volume") pod "a20b3259-17e4-4994-85fb-efd8f4cb4aa5" (UID: "a20b3259-17e4-4994-85fb-efd8f4cb4aa5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.521159 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.526444 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a20b3259-17e4-4994-85fb-efd8f4cb4aa5" (UID: "a20b3259-17e4-4994-85fb-efd8f4cb4aa5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.527449 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z" (OuterVolumeSpecName: "kube-api-access-zfm5z") pod "a20b3259-17e4-4994-85fb-efd8f4cb4aa5" (UID: "a20b3259-17e4-4994-85fb-efd8f4cb4aa5"). InnerVolumeSpecName "kube-api-access-zfm5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.622021 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfm5z\" (UniqueName: \"kubernetes.io/projected/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-kube-api-access-zfm5z\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:02 crc kubenswrapper[5127]: I0201 07:00:02.622072 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a20b3259-17e4-4994-85fb-efd8f4cb4aa5-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:00:03 crc kubenswrapper[5127]: I0201 07:00:03.166049 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" event={"ID":"a20b3259-17e4-4994-85fb-efd8f4cb4aa5","Type":"ContainerDied","Data":"8e84e65bbf92bcb87a774d2e1066a2d81a923ab4668d6b3aaa42fa4712ad3e4a"} Feb 01 07:00:03 crc kubenswrapper[5127]: I0201 07:00:03.166098 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e84e65bbf92bcb87a774d2e1066a2d81a923ab4668d6b3aaa42fa4712ad3e4a" Feb 01 07:00:03 crc kubenswrapper[5127]: I0201 07:00:03.166164 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662" Feb 01 07:00:03 crc kubenswrapper[5127]: I0201 07:00:03.977738 5127 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 01 07:00:36 crc kubenswrapper[5127]: I0201 07:00:36.741305 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:00:36 crc kubenswrapper[5127]: I0201 07:00:36.741860 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:01:06 crc kubenswrapper[5127]: I0201 07:01:06.740644 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:01:06 crc kubenswrapper[5127]: I0201 07:01:06.741387 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.170869 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njlcv"] Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179202 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-controller" 
containerID="cri-o://e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179343 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-node" containerID="cri-o://16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179331 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179396 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-acl-logging" containerID="cri-o://dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179359 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="sbdb" containerID="cri-o://e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.179501 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="northd" containerID="cri-o://c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.182534 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="nbdb" containerID="cri-o://d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.217233 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" containerID="cri-o://d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" gracePeriod=30 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.500081 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/3.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.502996 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovn-acl-logging/0.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.503575 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovn-controller/0.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.504083 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552355 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552417 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptwjj\" (UniqueName: \"kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552433 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552470 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552489 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552504 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552534 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552554 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket" (OuterVolumeSpecName: "log-socket") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552604 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552629 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552658 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552697 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552714 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552731 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552749 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552765 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash\") 
pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552783 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552823 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552847 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552867 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin\") pod \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\" (UID: \"5034ec6a-7968-4592-a09b-a57a56ebdbc5\") " Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552954 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.552982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553011 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553079 5127 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553081 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553090 5127 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553099 5127 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553105 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log" (OuterVolumeSpecName: "node-log") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553109 5127 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-log-socket\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553123 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553143 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553165 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553310 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash" (OuterVolumeSpecName: "host-slash") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553355 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553375 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553399 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553418 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.553559 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562047 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vp2ht"] Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562365 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562382 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562394 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562402 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562413 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a20b3259-17e4-4994-85fb-efd8f4cb4aa5" containerName="collect-profiles" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562420 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a20b3259-17e4-4994-85fb-efd8f4cb4aa5" containerName="collect-profiles" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562427 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562469 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562477 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="northd" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562485 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="northd" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562493 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="nbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562499 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="nbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562507 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562513 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562521 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562527 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562534 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kubecfg-setup" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562540 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kubecfg-setup" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562546 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-node" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562551 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-node" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562558 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-acl-logging" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562564 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-acl-logging" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562573 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="sbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562626 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="sbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562738 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562754 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="sbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562766 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovn-acl-logging" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562776 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="nbdb" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562786 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562798 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562807 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562816 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562825 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562832 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="kube-rbac-proxy-node" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562860 5127 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562871 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="northd" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562880 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a20b3259-17e4-4994-85fb-efd8f4cb4aa5" containerName="collect-profiles" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.562982 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.562994 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.563007 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.563014 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerName="ovnkube-controller" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.563308 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.563519 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj" (OuterVolumeSpecName: "kube-api-access-ptwjj") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "kube-api-access-ptwjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.565519 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.579632 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5034ec6a-7968-4592-a09b-a57a56ebdbc5" (UID: "5034ec6a-7968-4592-a09b-a57a56ebdbc5"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.654765 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-config\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.654870 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-var-lib-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655013 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655052 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-slash\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655083 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655114 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-netd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655173 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-ovn\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655230 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-script-lib\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655283 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655326 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-bin\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655373 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-systemd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-kubelet\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655480 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-netns\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655572 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-node-log\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655662 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f20317a-fc8b-4798-8b2b-9d763672282a-ovn-node-metrics-cert\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655698 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-env-overrides\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655729 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-log-socket\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655767 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-systemd-units\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655799 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7jj\" (UniqueName: \"kubernetes.io/projected/2f20317a-fc8b-4798-8b2b-9d763672282a-kube-api-access-7t7jj\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.655900 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-etc-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656011 5127 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656035 5127 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656056 5127 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656074 5127 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-node-log\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656090 5127 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656108 5127 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656124 5127 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656140 5127 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656157 5127 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-run-ovn\") on node 
\"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656172 5127 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-slash\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656189 5127 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656210 5127 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5034ec6a-7968-4592-a09b-a57a56ebdbc5-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656226 5127 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656242 5127 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5034ec6a-7968-4592-a09b-a57a56ebdbc5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656259 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5034ec6a-7968-4592-a09b-a57a56ebdbc5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.656277 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptwjj\" (UniqueName: \"kubernetes.io/projected/5034ec6a-7968-4592-a09b-a57a56ebdbc5-kube-api-access-ptwjj\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.681537 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/2.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.682362 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/1.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.682440 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d959741-37e1-43e7-9ef6-5f33433f9447" containerID="cb8c2c5fe80a0d88b2acc8e86cef40eb17b13dca0c3ed6ce63bb1ca011ae4786" exitCode=2 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.682552 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerDied","Data":"cb8c2c5fe80a0d88b2acc8e86cef40eb17b13dca0c3ed6ce63bb1ca011ae4786"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.682659 5127 scope.go:117] "RemoveContainer" containerID="a23d4d047783c69551cde0d194e153a9244f36191d78bc779ae1f153fa055d07" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.685952 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovnkube-controller/3.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.687952 5127 scope.go:117] "RemoveContainer" containerID="cb8c2c5fe80a0d88b2acc8e86cef40eb17b13dca0c3ed6ce63bb1ca011ae4786" Feb 01 07:01:18 crc kubenswrapper[5127]: 
I0201 07:01:18.689862 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovn-acl-logging/0.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.690467 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-njlcv_5034ec6a-7968-4592-a09b-a57a56ebdbc5/ovn-controller/0.log" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691186 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691228 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691244 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691263 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691277 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691291 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" exitCode=0 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691310 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" exitCode=143 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691325 5127 generic.go:334] "Generic (PLEG): container finished" podID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" exitCode=143 Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691356 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691394 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691416 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691438 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691457 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691475 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691476 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691495 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691513 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691526 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691537 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691547 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691557 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691568 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691602 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691615 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691628 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} Feb 01 07:01:18 crc 
kubenswrapper[5127]: I0201 07:01:18.691643 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691659 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691672 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691683 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691695 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691706 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691719 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691730 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691741 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691752 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691763 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691778 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691797 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691811 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691822 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691833 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691844 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691854 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691866 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691877 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691887 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691900 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691914 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-njlcv" event={"ID":"5034ec6a-7968-4592-a09b-a57a56ebdbc5","Type":"ContainerDied","Data":"0e594eee15da8e810adc438a0682ab8121b6fd3fc0e2b99d2df197f8c78f2d9b"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691929 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691942 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691953 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691964 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691974 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691985 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.691996 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.692008 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.692019 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.692029 5127 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.719407 5127 scope.go:117] "RemoveContainer" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.747911 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.751565 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njlcv"] Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.758721 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-netns\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.759021 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-node-log\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.759124 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-env-overrides\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.759217 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f20317a-fc8b-4798-8b2b-9d763672282a-ovn-node-metrics-cert\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.759309 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-log-socket\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.759403 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-systemd-units\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760706 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7jj\" (UniqueName: \"kubernetes.io/projected/2f20317a-fc8b-4798-8b2b-9d763672282a-kube-api-access-7t7jj\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760758 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-etc-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760801 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-config\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760828 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-var-lib-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760861 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760890 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760915 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-netd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-slash\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.760998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-ovn\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-script-lib\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761044 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761069 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-bin\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761093 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-systemd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761166 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-kubelet\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761280 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-kubelet\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761337 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-netns\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.761811 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-node-log\") pod \"ovnkube-node-vp2ht\" (UID: 
\"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.762390 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-env-overrides\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.762842 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-netd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.762958 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-log-socket\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.763021 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-systemd-units\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.763531 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-etc-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.764535 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-config\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.764820 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-var-lib-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.764894 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.764952 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-openvswitch\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc 
kubenswrapper[5127]: I0201 07:01:18.765008 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-run-ovn-kubernetes\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.765060 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-slash\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.765108 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-ovn\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.767256 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-run-systemd\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.767265 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f20317a-fc8b-4798-8b2b-9d763672282a-host-cni-bin\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.768000 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f20317a-fc8b-4798-8b2b-9d763672282a-ovn-node-metrics-cert\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.769981 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f20317a-fc8b-4798-8b2b-9d763672282a-ovnkube-script-lib\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.770382 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-njlcv"] Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.782697 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7jj\" (UniqueName: \"kubernetes.io/projected/2f20317a-fc8b-4798-8b2b-9d763672282a-kube-api-access-7t7jj\") pod \"ovnkube-node-vp2ht\" (UID: \"2f20317a-fc8b-4798-8b2b-9d763672282a\") " pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.784139 5127 scope.go:117] "RemoveContainer" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.815247 5127 scope.go:117] "RemoveContainer" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc 
kubenswrapper[5127]: I0201 07:01:18.836364 5127 scope.go:117] "RemoveContainer" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.855787 5127 scope.go:117] "RemoveContainer" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.868772 5127 scope.go:117] "RemoveContainer" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.881009 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.884448 5127 scope.go:117] "RemoveContainer" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.912067 5127 scope.go:117] "RemoveContainer" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.929889 5127 scope.go:117] "RemoveContainer" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.950383 5127 scope.go:117] "RemoveContainer" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.950862 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": container with ID starting with d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d not found: ID does not exist" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.950905 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} err="failed to get container status \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": rpc error: code = NotFound desc = could not find container \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": container with ID starting with d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.950933 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.951291 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": container with ID starting with a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589 not found: ID does not exist" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.951357 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} err="failed to get container status \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": rpc error: code = NotFound desc = could not find container 
\"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": container with ID starting with a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.951399 5127 scope.go:117] "RemoveContainer" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.951743 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": container with ID starting with e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a not found: ID does not exist" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.951779 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} err="failed to get container status \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": rpc error: code = NotFound desc = could not find container \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": container with ID starting with e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.951803 5127 scope.go:117] "RemoveContainer" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.952042 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": container with ID starting with d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5 not found: ID does not exist" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.952069 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} err="failed to get container status \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": rpc error: code = NotFound desc = could not find container \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": container with ID starting with d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.952086 5127 scope.go:117] "RemoveContainer" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.952378 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": container with ID starting with c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc not found: ID does not exist" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.952420 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} 
err="failed to get container status \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": rpc error: code = NotFound desc = could not find container \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": container with ID starting with c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.952443 5127 scope.go:117] "RemoveContainer" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.953077 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": container with ID starting with ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76 not found: ID does not exist" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953107 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} err="failed to get container status \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": rpc error: code = NotFound desc = could not find container \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": container with ID starting with ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953125 5127 scope.go:117] "RemoveContainer" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.953415 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": container with ID starting with 16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc not found: ID does not exist" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953452 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} err="failed to get container status \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": rpc error: code = NotFound desc = could not find container \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": container with ID starting with 16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953497 5127 scope.go:117] "RemoveContainer" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.953772 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": container with ID starting with dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d not found: ID does not exist" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953805 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} err="failed to get container status \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": rpc error: code = NotFound desc = could not find container \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": container with ID starting with dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.953822 5127 scope.go:117] "RemoveContainer" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.954078 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": container with ID starting with e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82 not found: ID does not exist" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.954098 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} err="failed to get container status \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": rpc error: code = NotFound desc = could not find container \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": container with ID starting with e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.954111 5127 scope.go:117] "RemoveContainer" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: E0201 07:01:18.954796 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": container with ID starting with 836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8 not found: ID does not exist" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.954854 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} err="failed to get container status \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": rpc error: code = NotFound desc = could not find container \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": container with ID starting with 836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.954872 5127 scope.go:117] "RemoveContainer" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955148 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} err="failed to get container status \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": rpc error: code = NotFound desc = could 
not find container \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": container with ID starting with d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955182 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955428 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} err="failed to get container status \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": rpc error: code = NotFound desc = could not find container \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": container with ID starting with a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955457 5127 scope.go:117] "RemoveContainer" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955693 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} err="failed to get container status \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": rpc error: code = NotFound desc = could not find container \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": container with ID starting with e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955717 5127 scope.go:117] "RemoveContainer" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.955967 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} err="failed to get container status \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": rpc error: code = NotFound desc = could not find container \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": container with ID starting with d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.956043 5127 scope.go:117] "RemoveContainer" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.956354 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} err="failed to get container status \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": rpc error: code = NotFound desc = could not find container \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": container with ID starting with c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.956382 5127 scope.go:117] "RemoveContainer" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.956792 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} err="failed to get container status \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": rpc error: code = NotFound desc = could not find container \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": container with ID starting with ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.956821 5127 scope.go:117] "RemoveContainer" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.957774 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} err="failed to get container status \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": rpc error: code = NotFound desc = could not find container \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": container with ID starting with 16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.957836 5127 scope.go:117] "RemoveContainer" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.958425 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} err="failed to get container status \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": rpc error: code = NotFound desc = could not find container \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": container with ID starting with dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.958455 5127 scope.go:117] "RemoveContainer" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.958767 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} err="failed to get container status \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": rpc error: code = NotFound desc = could not find container \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": container with ID starting with e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.958822 5127 scope.go:117] "RemoveContainer" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.959122 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} err="failed to get container status \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": rpc error: code = NotFound desc = could not find container \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": container with ID starting with 
836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.959183 5127 scope.go:117] "RemoveContainer" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.959466 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} err="failed to get container status \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": rpc error: code = NotFound desc = could not find container \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": container with ID starting with d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.959521 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.960050 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} err="failed to get container status \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": rpc error: code = NotFound desc = could not find container \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": container with ID starting with a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.960133 5127 scope.go:117] "RemoveContainer" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.960748 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} err="failed to get container status \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": rpc error: code = NotFound desc = could not find container \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": container with ID starting with e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.960790 5127 scope.go:117] "RemoveContainer" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.961099 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} err="failed to get container status \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": rpc error: code = NotFound desc = could not find container \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": container with ID starting with d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.961134 5127 scope.go:117] "RemoveContainer" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.962538 5127 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} err="failed to get container status \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": rpc error: code = NotFound desc = could not find container \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": container with ID starting with c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.962560 5127 scope.go:117] "RemoveContainer" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.962922 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} err="failed to get container status \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": rpc error: code = NotFound desc = could not find container \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": container with ID starting with ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.962953 5127 scope.go:117] "RemoveContainer" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.964185 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} err="failed to get container status \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": rpc error: code = NotFound desc = could not find container \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": container with ID starting with 16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.964208 5127 scope.go:117] "RemoveContainer" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.964556 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} err="failed to get container status \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": rpc error: code = NotFound desc = could not find container \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": container with ID starting with dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.964627 5127 scope.go:117] "RemoveContainer" containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.965342 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} err="failed to get container status \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": rpc error: code = NotFound desc = could not find container \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": container with ID starting with e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82 not found: ID does not exist" Feb 
01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.965377 5127 scope.go:117] "RemoveContainer" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.965942 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} err="failed to get container status \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": rpc error: code = NotFound desc = could not find container \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": container with ID starting with 836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.965966 5127 scope.go:117] "RemoveContainer" containerID="d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.966504 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d"} err="failed to get container status \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": rpc error: code = NotFound desc = could not find container \"d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d\": container with ID starting with d82347a32d41ae5a75252539e2bc9439402a6c71c9c01df5c1105c234e60199d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.966528 5127 scope.go:117] "RemoveContainer" containerID="a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.966822 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589"} err="failed to get container status \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": rpc error: code = NotFound desc = could not find container \"a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589\": container with ID starting with a92886b991d59b1553eb23251492d4705168b6d4ff970372ffa4d127423da589 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.966863 5127 scope.go:117] "RemoveContainer" containerID="e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967163 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a"} err="failed to get container status \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": rpc error: code = NotFound desc = could not find container \"e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a\": container with ID starting with e6b3bb041365bd86fda35c34ffb0b321c968bb2522dd4fa4cd2923f6c7ccca9a not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967187 5127 scope.go:117] "RemoveContainer" containerID="d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967653 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5"} err="failed to get container status 
\"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": rpc error: code = NotFound desc = could not find container \"d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5\": container with ID starting with d95f98b50ec994e4e40dcbc92d3e8278e4735492021d4f1777a164c10c4fc7e5 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967674 5127 scope.go:117] "RemoveContainer" containerID="c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967940 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc"} err="failed to get container status \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": rpc error: code = NotFound desc = could not find container \"c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc\": container with ID starting with c59459d0f5366af34b8ab8d55e72105ce9e5b1185e229482bc41fcce6df0b5dc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.967960 5127 scope.go:117] "RemoveContainer" containerID="ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968148 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76"} err="failed to get container status \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": rpc error: code = NotFound desc = could not find container \"ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76\": container with ID starting with ff6b3667785f68df68a249379fbd94f6bc4195ff9bf11c752697217c47404a76 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968176 5127 scope.go:117] "RemoveContainer" containerID="16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968359 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc"} err="failed to get container status \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": rpc error: code = NotFound desc = could not find container \"16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc\": container with ID starting with 16092c0f10abe4dd037d4c7f355b12a9ddef01f972537c9ee25b077feeb96bbc not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968375 5127 scope.go:117] "RemoveContainer" containerID="dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968547 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d"} err="failed to get container status \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": rpc error: code = NotFound desc = could not find container \"dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d\": container with ID starting with dc6b32cec4d94e92c9f34fb7618e997659a7fe3dd35c191ea87ab10ce58b166d not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968571 5127 scope.go:117] "RemoveContainer" 
containerID="e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968776 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82"} err="failed to get container status \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": rpc error: code = NotFound desc = could not find container \"e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82\": container with ID starting with e9737f12507ba00acbfecd942c622a5f7650b86228a10b912278c9e46ae26c82 not found: ID does not exist" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968791 5127 scope.go:117] "RemoveContainer" containerID="836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8" Feb 01 07:01:18 crc kubenswrapper[5127]: I0201 07:01:18.968964 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8"} err="failed to get container status \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": rpc error: code = NotFound desc = could not find container \"836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8\": container with ID starting with 836a62a954f4664078d35534d0ff2e126398eb9ae3603395667f6b963a17e2f8 not found: ID does not exist" Feb 01 07:01:19 crc kubenswrapper[5127]: I0201 07:01:19.700821 5127 generic.go:334] "Generic (PLEG): container finished" podID="2f20317a-fc8b-4798-8b2b-9d763672282a" containerID="13aa15d99799a43cdc476cf977d62ce39ea96ac042dd94434adf84145b80d009" exitCode=0 Feb 01 07:01:19 crc kubenswrapper[5127]: I0201 07:01:19.700909 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerDied","Data":"13aa15d99799a43cdc476cf977d62ce39ea96ac042dd94434adf84145b80d009"} Feb 01 07:01:19 crc kubenswrapper[5127]: I0201 07:01:19.701323 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"d0183135f0be4ef1a61e09cb2a929c0896e7a4ef6528b747a2b683d0edba29ab"} Feb 01 07:01:19 crc kubenswrapper[5127]: I0201 07:01:19.706001 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cmdjj_4d959741-37e1-43e7-9ef6-5f33433f9447/kube-multus/2.log" Feb 01 07:01:19 crc kubenswrapper[5127]: I0201 07:01:19.706253 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cmdjj" event={"ID":"4d959741-37e1-43e7-9ef6-5f33433f9447","Type":"ContainerStarted","Data":"800661e08555dbb769d2dabec07757fd00667d11d2d76184d5f69150955bf0f6"} Feb 01 07:01:20 crc kubenswrapper[5127]: I0201 07:01:20.243498 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5034ec6a-7968-4592-a09b-a57a56ebdbc5" path="/var/lib/kubelet/pods/5034ec6a-7968-4592-a09b-a57a56ebdbc5/volumes" Feb 01 07:01:20 crc kubenswrapper[5127]: I0201 07:01:20.716372 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"4def3f13d3e27c459a5f9019e24aa39fd55d502e9b5f4fc9a5a1674f8bf0c4a9"} Feb 01 07:01:20 crc kubenswrapper[5127]: I0201 07:01:20.716574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"6926538229dba07fe1b61284880c2944a516a9d50e448e9a1e45192f5a4c4727"} Feb 01 07:01:20 crc kubenswrapper[5127]: I0201 07:01:20.716616 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"7c575a07abe44a758da9e96b60190dfa75025590d9a8b7f962fb94d1e2b01049"} Feb 01 07:01:20 crc kubenswrapper[5127]: I0201 07:01:20.716626 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"68250ca942cf0da6e643ab8cf4de9b34b3de95b93ab37489ef404c08c6a63998"} Feb 01 07:01:21 crc kubenswrapper[5127]: I0201 07:01:21.725776 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"9bf5619fd46997c39bfdc799e94f5da483dcff2fe27ac1b821d6034d6c4e3b53"} Feb 01 07:01:21 crc kubenswrapper[5127]: I0201 07:01:21.726097 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"00ad14d5362bb5423c8ae9d003439d9b22a44968ca3769f2ff416a58a9007e92"} Feb 01 07:01:23 crc kubenswrapper[5127]: I0201 07:01:23.744065 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"5f25c1cf77d1276453cf06ed4b4409c87d6e37965a27367273a6cc78f808b7a7"} Feb 01 07:01:26 crc kubenswrapper[5127]: I0201 07:01:26.777499 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" event={"ID":"2f20317a-fc8b-4798-8b2b-9d763672282a","Type":"ContainerStarted","Data":"0874a7987f3e90ea27dd8c1ae72f59054e46dcb3e98e490d4e1699e0799e87c7"} Feb 01 07:01:26 crc kubenswrapper[5127]: I0201 07:01:26.778148 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:26 crc kubenswrapper[5127]: I0201 07:01:26.812073 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:26 crc kubenswrapper[5127]: I0201 07:01:26.813197 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" podStartSLOduration=8.813176271 podStartE2EDuration="8.813176271s" podCreationTimestamp="2026-02-01 07:01:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:01:26.809530084 +0000 UTC m=+837.295432467" watchObservedRunningTime="2026-02-01 07:01:26.813176271 +0000 UTC m=+837.299078634" Feb 01 07:01:27 crc kubenswrapper[5127]: I0201 07:01:27.784831 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:27 crc kubenswrapper[5127]: I0201 07:01:27.784897 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:27 crc kubenswrapper[5127]: I0201 07:01:27.810948 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.202093 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-4n6s2"] Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.203698 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.207052 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.208478 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.208653 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.211547 5127 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wxj58" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.211827 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4n6s2"] Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.229468 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.229738 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhk5\" (UniqueName: \"kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.230034 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.331258 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.331359 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.331499 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhk5\" (UniqueName: \"kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 
07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.331821 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.332502 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.364133 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhk5\" (UniqueName: \"kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5\") pod \"crc-storage-crc-4n6s2\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.529848 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.561708 5127 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(91ccd56df06ba50ddd84693c59cb6635e0a10dd0de3fff1bca23f81bbde1c793): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.561811 5127 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(91ccd56df06ba50ddd84693c59cb6635e0a10dd0de3fff1bca23f81bbde1c793): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.561849 5127 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(91ccd56df06ba50ddd84693c59cb6635e0a10dd0de3fff1bca23f81bbde1c793): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.561917 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4n6s2_crc-storage(377e5b6c-e0fb-4e24-8b4d-19f66394ee94)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4n6s2_crc-storage(377e5b6c-e0fb-4e24-8b4d-19f66394ee94)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(91ccd56df06ba50ddd84693c59cb6635e0a10dd0de3fff1bca23f81bbde1c793): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4n6s2" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.790905 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: I0201 07:01:28.791544 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.831777 5127 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(ebc3ce50d18aa957df8c55ab5255e6578b607e4f6e6e19c33a417b6ece494705): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.831909 5127 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(ebc3ce50d18aa957df8c55ab5255e6578b607e4f6e6e19c33a417b6ece494705): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.831950 5127 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(ebc3ce50d18aa957df8c55ab5255e6578b607e4f6e6e19c33a417b6ece494705): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:28 crc kubenswrapper[5127]: E0201 07:01:28.832053 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-4n6s2_crc-storage(377e5b6c-e0fb-4e24-8b4d-19f66394ee94)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-4n6s2_crc-storage(377e5b6c-e0fb-4e24-8b4d-19f66394ee94)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-4n6s2_crc-storage_377e5b6c-e0fb-4e24-8b4d-19f66394ee94_0(ebc3ce50d18aa957df8c55ab5255e6578b607e4f6e6e19c33a417b6ece494705): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-4n6s2" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" Feb 01 07:01:36 crc kubenswrapper[5127]: I0201 07:01:36.740867 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:01:36 crc kubenswrapper[5127]: I0201 07:01:36.741536 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:01:36 crc kubenswrapper[5127]: I0201 07:01:36.741625 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:01:36 crc kubenswrapper[5127]: I0201 07:01:36.742397 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:01:36 crc kubenswrapper[5127]: I0201 07:01:36.742494 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7" gracePeriod=600 Feb 01 07:01:37 crc kubenswrapper[5127]: I0201 07:01:37.856552 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7" exitCode=0 Feb 01 07:01:37 crc kubenswrapper[5127]: I0201 07:01:37.856655 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7"} Feb 01 07:01:37 crc kubenswrapper[5127]: I0201 07:01:37.857192 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3"} Feb 01 07:01:37 crc kubenswrapper[5127]: I0201 07:01:37.857225 5127 scope.go:117] "RemoveContainer" containerID="256bf9829fc049633822f2493668187c71ca130c9e57dfc280716cc083afada1" Feb 01 07:01:41 crc kubenswrapper[5127]: I0201 07:01:41.235084 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:41 crc kubenswrapper[5127]: I0201 07:01:41.236172 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:41 crc kubenswrapper[5127]: I0201 07:01:41.475947 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-4n6s2"] Feb 01 07:01:41 crc kubenswrapper[5127]: I0201 07:01:41.497253 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:01:41 crc kubenswrapper[5127]: I0201 07:01:41.889286 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4n6s2" event={"ID":"377e5b6c-e0fb-4e24-8b4d-19f66394ee94","Type":"ContainerStarted","Data":"b90345a12eb1313d3ac01b10ca3b38ac07ed4667fe530a288ebe071d9a3ccdb2"} Feb 01 07:01:42 crc kubenswrapper[5127]: I0201 07:01:42.897188 5127 generic.go:334] "Generic (PLEG): container finished" podID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" containerID="3f0de04c7ee6cbadfdd33a4d628cf344e5cc5c64b0f94574b5ece8c9a0e85473" exitCode=0 Feb 01 07:01:42 crc kubenswrapper[5127]: I0201 07:01:42.897275 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4n6s2" event={"ID":"377e5b6c-e0fb-4e24-8b4d-19f66394ee94","Type":"ContainerDied","Data":"3f0de04c7ee6cbadfdd33a4d628cf344e5cc5c64b0f94574b5ece8c9a0e85473"} Feb 01 07:01:44 crc kubenswrapper[5127]: E0201 07:01:44.053439 5127 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.144991 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.268435 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhk5\" (UniqueName: \"kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5\") pod \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.268617 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt\") pod \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.268755 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage\") pod \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\" (UID: \"377e5b6c-e0fb-4e24-8b4d-19f66394ee94\") " Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.269154 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "377e5b6c-e0fb-4e24-8b4d-19f66394ee94" (UID: "377e5b6c-e0fb-4e24-8b4d-19f66394ee94"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.278219 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5" (OuterVolumeSpecName: "kube-api-access-qdhk5") pod "377e5b6c-e0fb-4e24-8b4d-19f66394ee94" (UID: "377e5b6c-e0fb-4e24-8b4d-19f66394ee94"). InnerVolumeSpecName "kube-api-access-qdhk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.287028 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "377e5b6c-e0fb-4e24-8b4d-19f66394ee94" (UID: "377e5b6c-e0fb-4e24-8b4d-19f66394ee94"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.370072 5127 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.370132 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhk5\" (UniqueName: \"kubernetes.io/projected/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-kube-api-access-qdhk5\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.370153 5127 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/377e5b6c-e0fb-4e24-8b4d-19f66394ee94-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.910495 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-4n6s2" event={"ID":"377e5b6c-e0fb-4e24-8b4d-19f66394ee94","Type":"ContainerDied","Data":"b90345a12eb1313d3ac01b10ca3b38ac07ed4667fe530a288ebe071d9a3ccdb2"} Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.910534 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90345a12eb1313d3ac01b10ca3b38ac07ed4667fe530a288ebe071d9a3ccdb2" Feb 01 07:01:44 crc kubenswrapper[5127]: I0201 07:01:44.910647 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-4n6s2" Feb 01 07:01:48 crc kubenswrapper[5127]: I0201 07:01:48.918969 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vp2ht" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.730300 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw"] Feb 01 07:01:51 crc kubenswrapper[5127]: E0201 07:01:51.730820 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" containerName="storage" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.730834 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" containerName="storage" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.730934 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" containerName="storage" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.731676 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.734057 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.744903 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw"] Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.773566 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.774054 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.774139 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xp7z\" (UniqueName: \"kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.875829 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.875958 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.876071 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xp7z\" (UniqueName: \"kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.877237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.877374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:51 crc kubenswrapper[5127]: I0201 07:01:51.912193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xp7z\" (UniqueName: \"kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:52 crc kubenswrapper[5127]: I0201 07:01:52.051761 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:52 crc kubenswrapper[5127]: I0201 07:01:52.283123 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw"] Feb 01 07:01:52 crc kubenswrapper[5127]: W0201 07:01:52.292412 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f9a5ce_8747_43ec_827e_8392c57165df.slice/crio-73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640 WatchSource:0}: Error finding container 73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640: Status 404 returned error can't find the container with id 73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640 Feb 01 07:01:52 crc kubenswrapper[5127]: I0201 07:01:52.964229 5127 generic.go:334] "Generic (PLEG): container finished" podID="34f9a5ce-8747-43ec-827e-8392c57165df" containerID="ceb5f5fdbed8c13c0f4fdcfc3a0087a1e20ed14247c38916b0a81844a6d7ade2" exitCode=0 Feb 01 07:01:52 crc kubenswrapper[5127]: I0201 07:01:52.964315 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" event={"ID":"34f9a5ce-8747-43ec-827e-8392c57165df","Type":"ContainerDied","Data":"ceb5f5fdbed8c13c0f4fdcfc3a0087a1e20ed14247c38916b0a81844a6d7ade2"} Feb 01 07:01:52 crc kubenswrapper[5127]: I0201 07:01:52.964367 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" event={"ID":"34f9a5ce-8747-43ec-827e-8392c57165df","Type":"ContainerStarted","Data":"73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640"} Feb 01 07:01:53 crc kubenswrapper[5127]: I0201 07:01:53.980350 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:01:53 crc kubenswrapper[5127]: I0201 07:01:53.982576 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.006781 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.014540 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dljsh\" (UniqueName: \"kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.014732 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.014925 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.115979 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.116084 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dljsh\" (UniqueName: \"kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.116104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.116564 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.116854 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.155072 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dljsh\" (UniqueName: \"kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh\") pod \"redhat-operators-grn6s\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.310928 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.515964 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:01:54 crc kubenswrapper[5127]: W0201 07:01:54.559270 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8546f7_eca8_4487_b502_5293a676dc22.slice/crio-236a993166fba02bda074d1c1004442b2b68eddc085c294250f2bdec20ace03e WatchSource:0}: Error finding container 236a993166fba02bda074d1c1004442b2b68eddc085c294250f2bdec20ace03e: Status 404 returned error can't find the container with id 236a993166fba02bda074d1c1004442b2b68eddc085c294250f2bdec20ace03e Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.985929 5127 generic.go:334] "Generic (PLEG): container finished" podID="34f9a5ce-8747-43ec-827e-8392c57165df" containerID="8eab84b47b3810352a65773d5866991c7526d80cecbf1f14c5f8d5ee5e245dcf" exitCode=0 Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.986007 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" event={"ID":"34f9a5ce-8747-43ec-827e-8392c57165df","Type":"ContainerDied","Data":"8eab84b47b3810352a65773d5866991c7526d80cecbf1f14c5f8d5ee5e245dcf"} Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.987624 5127 generic.go:334] "Generic (PLEG): container finished" podID="df8546f7-eca8-4487-b502-5293a676dc22" containerID="8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7" exitCode=0 Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.987675 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerDied","Data":"8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7"} Feb 01 07:01:54 crc kubenswrapper[5127]: I0201 07:01:54.987704 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerStarted","Data":"236a993166fba02bda074d1c1004442b2b68eddc085c294250f2bdec20ace03e"} Feb 01 07:01:55 crc kubenswrapper[5127]: I0201 07:01:55.998658 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerStarted","Data":"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1"} Feb 01 07:01:56 crc kubenswrapper[5127]: I0201 07:01:56.001854 5127 generic.go:334] "Generic (PLEG): container finished" podID="34f9a5ce-8747-43ec-827e-8392c57165df" containerID="d1781909c1d68374334b673cbf6766d54f546624a9f581df63c6e65f0c9124f0" exitCode=0 Feb 01 07:01:56 crc kubenswrapper[5127]: I0201 07:01:56.002015 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" 
event={"ID":"34f9a5ce-8747-43ec-827e-8392c57165df","Type":"ContainerDied","Data":"d1781909c1d68374334b673cbf6766d54f546624a9f581df63c6e65f0c9124f0"} Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.011989 5127 generic.go:334] "Generic (PLEG): container finished" podID="df8546f7-eca8-4487-b502-5293a676dc22" containerID="7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1" exitCode=0 Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.012116 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerDied","Data":"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1"} Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.266897 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.355777 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xp7z\" (UniqueName: \"kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z\") pod \"34f9a5ce-8747-43ec-827e-8392c57165df\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.355937 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util\") pod \"34f9a5ce-8747-43ec-827e-8392c57165df\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.356008 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle\") pod \"34f9a5ce-8747-43ec-827e-8392c57165df\" (UID: \"34f9a5ce-8747-43ec-827e-8392c57165df\") " Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.357671 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle" (OuterVolumeSpecName: "bundle") pod "34f9a5ce-8747-43ec-827e-8392c57165df" (UID: "34f9a5ce-8747-43ec-827e-8392c57165df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.370605 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z" (OuterVolumeSpecName: "kube-api-access-2xp7z") pod "34f9a5ce-8747-43ec-827e-8392c57165df" (UID: "34f9a5ce-8747-43ec-827e-8392c57165df"). InnerVolumeSpecName "kube-api-access-2xp7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.375246 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util" (OuterVolumeSpecName: "util") pod "34f9a5ce-8747-43ec-827e-8392c57165df" (UID: "34f9a5ce-8747-43ec-827e-8392c57165df"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.457637 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xp7z\" (UniqueName: \"kubernetes.io/projected/34f9a5ce-8747-43ec-827e-8392c57165df-kube-api-access-2xp7z\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.457955 5127 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-util\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:57 crc kubenswrapper[5127]: I0201 07:01:57.458023 5127 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f9a5ce-8747-43ec-827e-8392c57165df-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:01:58 crc kubenswrapper[5127]: I0201 07:01:58.024261 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerStarted","Data":"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f"} Feb 01 07:01:58 crc kubenswrapper[5127]: I0201 07:01:58.028436 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" event={"ID":"34f9a5ce-8747-43ec-827e-8392c57165df","Type":"ContainerDied","Data":"73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640"} Feb 01 07:01:58 crc kubenswrapper[5127]: I0201 07:01:58.028645 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d753aaa4727689484ed514d211f55970cf1beb79b395579559ae4c6db04640" Feb 01 07:01:58 crc kubenswrapper[5127]: I0201 07:01:58.028510 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw" Feb 01 07:01:58 crc kubenswrapper[5127]: I0201 07:01:58.057985 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grn6s" podStartSLOduration=2.564066243 podStartE2EDuration="5.057952758s" podCreationTimestamp="2026-02-01 07:01:53 +0000 UTC" firstStartedPulling="2026-02-01 07:01:54.988748738 +0000 UTC m=+865.474651101" lastFinishedPulling="2026-02-01 07:01:57.482635253 +0000 UTC m=+867.968537616" observedRunningTime="2026-02-01 07:01:58.046714127 +0000 UTC m=+868.532616540" watchObservedRunningTime="2026-02-01 07:01:58.057952758 +0000 UTC m=+868.543855161" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.098734 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-97zr9"] Feb 01 07:02:02 crc kubenswrapper[5127]: E0201 07:02:02.099307 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="pull" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.099320 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="pull" Feb 01 07:02:02 crc kubenswrapper[5127]: E0201 07:02:02.099341 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="util" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.099349 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="util" Feb 01 07:02:02 crc kubenswrapper[5127]: E0201 07:02:02.099387 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="extract" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.099395 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="extract" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.099510 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f9a5ce-8747-43ec-827e-8392c57165df" containerName="extract" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.100012 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.101720 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-q4jr4" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.101849 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.102283 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.108158 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-97zr9"] Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.118066 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjdws\" (UniqueName: \"kubernetes.io/projected/0cbcc60e-a163-42c1-871e-fd30c9d8e0f8-kube-api-access-kjdws\") pod \"nmstate-operator-646758c888-97zr9\" (UID: \"0cbcc60e-a163-42c1-871e-fd30c9d8e0f8\") " pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.219391 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjdws\" (UniqueName: \"kubernetes.io/projected/0cbcc60e-a163-42c1-871e-fd30c9d8e0f8-kube-api-access-kjdws\") pod \"nmstate-operator-646758c888-97zr9\" (UID: \"0cbcc60e-a163-42c1-871e-fd30c9d8e0f8\") " pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.235936 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjdws\" (UniqueName: \"kubernetes.io/projected/0cbcc60e-a163-42c1-871e-fd30c9d8e0f8-kube-api-access-kjdws\") pod \"nmstate-operator-646758c888-97zr9\" (UID: \"0cbcc60e-a163-42c1-871e-fd30c9d8e0f8\") " pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.476239 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" Feb 01 07:02:02 crc kubenswrapper[5127]: I0201 07:02:02.694704 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-97zr9"] Feb 01 07:02:03 crc kubenswrapper[5127]: I0201 07:02:03.057868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" event={"ID":"0cbcc60e-a163-42c1-871e-fd30c9d8e0f8","Type":"ContainerStarted","Data":"329f94c7f87776cd5110905b96d88e1d6d5d927798af9555f9bfbb18049b836d"} Feb 01 07:02:04 crc kubenswrapper[5127]: I0201 07:02:04.312376 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:04 crc kubenswrapper[5127]: I0201 07:02:04.312472 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:05 crc kubenswrapper[5127]: I0201 07:02:05.357731 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grn6s" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="registry-server" probeResult="failure" output=< Feb 01 07:02:05 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 07:02:05 crc kubenswrapper[5127]: > Feb 01 07:02:06 crc kubenswrapper[5127]: I0201 07:02:06.078615 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" event={"ID":"0cbcc60e-a163-42c1-871e-fd30c9d8e0f8","Type":"ContainerStarted","Data":"1d61f147235f94b9def5233e4974453e15a555299a5c345713ffcaba9b5c8455"} Feb 01 07:02:06 crc kubenswrapper[5127]: I0201 07:02:06.100168 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-97zr9" podStartSLOduration=1.823041376 podStartE2EDuration="4.100149144s" podCreationTimestamp="2026-02-01 07:02:02 +0000 UTC" firstStartedPulling="2026-02-01 07:02:02.717205178 +0000 UTC m=+873.203107541" lastFinishedPulling="2026-02-01 07:02:04.994312956 +0000 UTC m=+875.480215309" observedRunningTime="2026-02-01 07:02:06.098608122 +0000 UTC m=+876.584510495" watchObservedRunningTime="2026-02-01 07:02:06.100149144 +0000 UTC m=+876.586051507" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.571374 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.576059 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.602796 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.726111 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.726208 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.726248 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6vqm\" (UniqueName: \"kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.827668 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.827733 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6vqm\" (UniqueName: \"kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.827830 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.828204 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.828424 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.852426 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f6vqm\" (UniqueName: \"kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm\") pod \"redhat-marketplace-g72lc\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:09 crc kubenswrapper[5127]: I0201 07:02:09.896700 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:10 crc kubenswrapper[5127]: I0201 07:02:10.151139 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:11 crc kubenswrapper[5127]: I0201 07:02:11.114245 5127 generic.go:334] "Generic (PLEG): container finished" podID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerID="9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51" exitCode=0 Feb 01 07:02:11 crc kubenswrapper[5127]: I0201 07:02:11.114484 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerDied","Data":"9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51"} Feb 01 07:02:11 crc kubenswrapper[5127]: I0201 07:02:11.114870 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerStarted","Data":"a08ee632d8b853aa916dfabbd0c603c4dacb607fa93d5cde8b7424e28ae01790"} Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.000537 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zlwp8"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.001596 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.003824 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-z8tfr" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.012578 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.013686 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.016464 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.017999 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zlwp8"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.024100 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qmpft"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.024719 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.040850 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074041 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-dbus-socket\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074321 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xbn\" (UniqueName: \"kubernetes.io/projected/1f8bacee-d920-4264-bfc2-249be9f4c352-kube-api-access-n9xbn\") pod \"nmstate-metrics-54757c584b-zlwp8\" (UID: \"1f8bacee-d920-4264-bfc2-249be9f4c352\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074343 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspzs\" (UniqueName: \"kubernetes.io/projected/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-kube-api-access-vspzs\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074401 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vdd\" (UniqueName: \"kubernetes.io/projected/eda42d6f-e70e-40d5-98f4-1329398803ae-kube-api-access-99vdd\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074493 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-nmstate-lock\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074875 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.074903 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-ovs-socket\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.121892 5127 generic.go:334] "Generic (PLEG): container finished" podID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerID="e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc" exitCode=0 Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.121933 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerDied","Data":"e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc"} Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.128335 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.129703 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.131421 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vgr4w" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.132285 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.132441 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.149785 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175613 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vdd\" (UniqueName: \"kubernetes.io/projected/eda42d6f-e70e-40d5-98f4-1329398803ae-kube-api-access-99vdd\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175670 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-nmstate-lock\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175700 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175718 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-ovs-socket\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175740 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1300e867-c21d-4450-bfa2-24c0ab3c8a21-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-dbus-socket\") pod \"nmstate-handler-qmpft\" (UID: 
\"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175791 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175819 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xbn\" (UniqueName: \"kubernetes.io/projected/1f8bacee-d920-4264-bfc2-249be9f4c352-kube-api-access-n9xbn\") pod \"nmstate-metrics-54757c584b-zlwp8\" (UID: \"1f8bacee-d920-4264-bfc2-249be9f4c352\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175825 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-nmstate-lock\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175886 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-ovs-socket\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: E0201 07:02:12.175956 5127 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 01 07:02:12 crc kubenswrapper[5127]: E0201 07:02:12.176013 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair podName:2049e4f2-f393-4ce1-bd69-33f34b97b2a8 nodeName:}" failed. No retries permitted until 2026-02-01 07:02:12.675990987 +0000 UTC m=+883.161893460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-zpcmk" (UID: "2049e4f2-f393-4ce1-bd69-33f34b97b2a8") : secret "openshift-nmstate-webhook" not found Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.176038 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/eda42d6f-e70e-40d5-98f4-1329398803ae-dbus-socket\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.175837 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4v95\" (UniqueName: \"kubernetes.io/projected/1300e867-c21d-4450-bfa2-24c0ab3c8a21-kube-api-access-d4v95\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.176066 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspzs\" (UniqueName: \"kubernetes.io/projected/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-kube-api-access-vspzs\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.194709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vdd\" (UniqueName: \"kubernetes.io/projected/eda42d6f-e70e-40d5-98f4-1329398803ae-kube-api-access-99vdd\") pod \"nmstate-handler-qmpft\" (UID: \"eda42d6f-e70e-40d5-98f4-1329398803ae\") " pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.197109 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspzs\" (UniqueName: \"kubernetes.io/projected/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-kube-api-access-vspzs\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.204534 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xbn\" (UniqueName: \"kubernetes.io/projected/1f8bacee-d920-4264-bfc2-249be9f4c352-kube-api-access-n9xbn\") pod \"nmstate-metrics-54757c584b-zlwp8\" (UID: \"1f8bacee-d920-4264-bfc2-249be9f4c352\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.276699 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1300e867-c21d-4450-bfa2-24c0ab3c8a21-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.276773 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " 
pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.276803 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4v95\" (UniqueName: \"kubernetes.io/projected/1300e867-c21d-4450-bfa2-24c0ab3c8a21-kube-api-access-d4v95\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: E0201 07:02:12.277023 5127 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 01 07:02:12 crc kubenswrapper[5127]: E0201 07:02:12.277128 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert podName:1300e867-c21d-4450-bfa2-24c0ab3c8a21 nodeName:}" failed. No retries permitted until 2026-02-01 07:02:12.777101096 +0000 UTC m=+883.263003469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-m6qwl" (UID: "1300e867-c21d-4450-bfa2-24c0ab3c8a21") : secret "plugin-serving-cert" not found Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.277558 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/1300e867-c21d-4450-bfa2-24c0ab3c8a21-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.291960 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4v95\" (UniqueName: \"kubernetes.io/projected/1300e867-c21d-4450-bfa2-24c0ab3c8a21-kube-api-access-d4v95\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.314920 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.315661 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c6cc56dfb-j944f"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.316538 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.330380 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c6cc56dfb-j944f"] Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.352080 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377378 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377458 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-service-ca\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377564 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-oauth-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377628 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-oauth-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377694 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377725 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-trusted-ca-bundle\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.377760 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlxxt\" (UniqueName: \"kubernetes.io/projected/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-kube-api-access-rlxxt\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.478661 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-service-ca\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.478948 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-oauth-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.478976 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-oauth-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479009 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479042 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-trusted-ca-bundle\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479066 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlxxt\" (UniqueName: \"kubernetes.io/projected/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-kube-api-access-rlxxt\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479095 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479818 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.479899 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-service-ca\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.480383 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-oauth-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.481099 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-trusted-ca-bundle\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.484811 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-serving-cert\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.488104 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-console-oauth-config\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.496508 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlxxt\" (UniqueName: \"kubernetes.io/projected/c3e8ad10-e3bc-4211-93b5-5624fd1795cf-kube-api-access-rlxxt\") pod \"console-6c6cc56dfb-j944f\" (UID: \"c3e8ad10-e3bc-4211-93b5-5624fd1795cf\") " pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.513621 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-zlwp8"] Feb 01 07:02:12 crc kubenswrapper[5127]: W0201 07:02:12.527399 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f8bacee_d920_4264_bfc2_249be9f4c352.slice/crio-2c7586817c88732aa79ad015943422ba92a70e48ac07ce0d75eabcca63dc13dc WatchSource:0}: Error finding container 2c7586817c88732aa79ad015943422ba92a70e48ac07ce0d75eabcca63dc13dc: Status 404 returned error can't find the container with id 2c7586817c88732aa79ad015943422ba92a70e48ac07ce0d75eabcca63dc13dc Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.653347 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.682627 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.687043 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2049e4f2-f393-4ce1-bd69-33f34b97b2a8-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-zpcmk\" (UID: \"2049e4f2-f393-4ce1-bd69-33f34b97b2a8\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.785284 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.790830 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/1300e867-c21d-4450-bfa2-24c0ab3c8a21-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-m6qwl\" (UID: \"1300e867-c21d-4450-bfa2-24c0ab3c8a21\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.844516 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c6cc56dfb-j944f"] Feb 01 07:02:12 crc kubenswrapper[5127]: W0201 07:02:12.856573 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3e8ad10_e3bc_4211_93b5_5624fd1795cf.slice/crio-bef97fb8d47ca0f0ba28e8a55ca31abc4d660b1976e2b087b4eb1edf533df79a WatchSource:0}: Error finding container bef97fb8d47ca0f0ba28e8a55ca31abc4d660b1976e2b087b4eb1edf533df79a: Status 404 returned error can't find the container with id bef97fb8d47ca0f0ba28e8a55ca31abc4d660b1976e2b087b4eb1edf533df79a Feb 01 07:02:12 crc kubenswrapper[5127]: I0201 07:02:12.928222 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.048153 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.130829 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qmpft" event={"ID":"eda42d6f-e70e-40d5-98f4-1329398803ae","Type":"ContainerStarted","Data":"99844a1df15c3d1c0ad6e97c81a41154e9a6215542d1e3678e693cfa151df03d"} Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.135485 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerStarted","Data":"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4"} Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.146247 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6cc56dfb-j944f" event={"ID":"c3e8ad10-e3bc-4211-93b5-5624fd1795cf","Type":"ContainerStarted","Data":"81dedbdd0fcef77df57107393cfa2d0ed19adc5a1ad198226f8c76249798e86e"} Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.146297 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c6cc56dfb-j944f" event={"ID":"c3e8ad10-e3bc-4211-93b5-5624fd1795cf","Type":"ContainerStarted","Data":"bef97fb8d47ca0f0ba28e8a55ca31abc4d660b1976e2b087b4eb1edf533df79a"} Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.158383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" event={"ID":"1f8bacee-d920-4264-bfc2-249be9f4c352","Type":"ContainerStarted","Data":"2c7586817c88732aa79ad015943422ba92a70e48ac07ce0d75eabcca63dc13dc"} Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.169410 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g72lc" podStartSLOduration=2.730555442 podStartE2EDuration="4.169388962s" podCreationTimestamp="2026-02-01 07:02:09 +0000 UTC" firstStartedPulling="2026-02-01 07:02:11.118280529 +0000 UTC m=+881.604182922" lastFinishedPulling="2026-02-01 07:02:12.557114079 +0000 UTC m=+883.043016442" observedRunningTime="2026-02-01 07:02:13.168505699 +0000 UTC m=+883.654408062" watchObservedRunningTime="2026-02-01 07:02:13.169388962 +0000 UTC m=+883.655291325" Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.302126 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c6cc56dfb-j944f" podStartSLOduration=1.302110978 podStartE2EDuration="1.302110978s" podCreationTimestamp="2026-02-01 07:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:02:13.192563543 +0000 UTC m=+883.678465906" watchObservedRunningTime="2026-02-01 07:02:13.302110978 +0000 UTC m=+883.788013341" Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.303878 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl"] Feb 01 07:02:13 crc kubenswrapper[5127]: W0201 07:02:13.308345 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1300e867_c21d_4450_bfa2_24c0ab3c8a21.slice/crio-868e88946cd0edc13003ada10458747555c7156a5b7dbbe2eaa5106c23f98461 WatchSource:0}: Error finding container 868e88946cd0edc13003ada10458747555c7156a5b7dbbe2eaa5106c23f98461: Status 404 returned error can't find the container with id 
868e88946cd0edc13003ada10458747555c7156a5b7dbbe2eaa5106c23f98461 Feb 01 07:02:13 crc kubenswrapper[5127]: I0201 07:02:13.353504 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk"] Feb 01 07:02:13 crc kubenswrapper[5127]: W0201 07:02:13.366390 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2049e4f2_f393_4ce1_bd69_33f34b97b2a8.slice/crio-9600da8eb6e8eecf2f904f84332c105a5f6b0decf835aa935df8351fa8f2a76c WatchSource:0}: Error finding container 9600da8eb6e8eecf2f904f84332c105a5f6b0decf835aa935df8351fa8f2a76c: Status 404 returned error can't find the container with id 9600da8eb6e8eecf2f904f84332c105a5f6b0decf835aa935df8351fa8f2a76c Feb 01 07:02:14 crc kubenswrapper[5127]: I0201 07:02:14.165348 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" event={"ID":"1300e867-c21d-4450-bfa2-24c0ab3c8a21","Type":"ContainerStarted","Data":"868e88946cd0edc13003ada10458747555c7156a5b7dbbe2eaa5106c23f98461"} Feb 01 07:02:14 crc kubenswrapper[5127]: I0201 07:02:14.166757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" event={"ID":"2049e4f2-f393-4ce1-bd69-33f34b97b2a8","Type":"ContainerStarted","Data":"9600da8eb6e8eecf2f904f84332c105a5f6b0decf835aa935df8351fa8f2a76c"} Feb 01 07:02:14 crc kubenswrapper[5127]: I0201 07:02:14.368621 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:14 crc kubenswrapper[5127]: I0201 07:02:14.415890 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:15 crc kubenswrapper[5127]: I0201 07:02:15.959874 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.180412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" event={"ID":"2049e4f2-f393-4ce1-bd69-33f34b97b2a8","Type":"ContainerStarted","Data":"a1d7b42c7aebb982ae4de376a4a99452cf34150206fb9b1968d7acf7bdcce8e5"} Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.180951 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.182619 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" event={"ID":"1f8bacee-d920-4264-bfc2-249be9f4c352","Type":"ContainerStarted","Data":"d809b4485922a82750c1a7e68339175c27bfc2bb17cdef497e7a073fe19b85a0"} Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.184720 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qmpft" event={"ID":"eda42d6f-e70e-40d5-98f4-1329398803ae","Type":"ContainerStarted","Data":"ad8a82adacacb8b0f9a794ccbf4cf197e063ec1a79a72160fd7b385af8d21753"} Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.184858 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.186848 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" 
event={"ID":"1300e867-c21d-4450-bfa2-24c0ab3c8a21","Type":"ContainerStarted","Data":"3cd7a3f382eeed790b571ffb5c30d1690efbdd71c2d44329fe64ceb7f9c29fb2"} Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.186917 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grn6s" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="registry-server" containerID="cri-o://951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f" gracePeriod=2 Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.198450 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" podStartSLOduration=3.5310138220000002 podStartE2EDuration="5.198430796s" podCreationTimestamp="2026-02-01 07:02:11 +0000 UTC" firstStartedPulling="2026-02-01 07:02:13.36900452 +0000 UTC m=+883.854906883" lastFinishedPulling="2026-02-01 07:02:15.036421494 +0000 UTC m=+885.522323857" observedRunningTime="2026-02-01 07:02:16.193685899 +0000 UTC m=+886.679588262" watchObservedRunningTime="2026-02-01 07:02:16.198430796 +0000 UTC m=+886.684333169" Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.214189 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qmpft" podStartSLOduration=1.5961338349999998 podStartE2EDuration="4.214171618s" podCreationTimestamp="2026-02-01 07:02:12 +0000 UTC" firstStartedPulling="2026-02-01 07:02:12.377733622 +0000 UTC m=+882.863635985" lastFinishedPulling="2026-02-01 07:02:14.995771405 +0000 UTC m=+885.481673768" observedRunningTime="2026-02-01 07:02:16.212564864 +0000 UTC m=+886.698467227" watchObservedRunningTime="2026-02-01 07:02:16.214171618 +0000 UTC m=+886.700073971" Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.967969 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:16 crc kubenswrapper[5127]: I0201 07:02:16.988037 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-m6qwl" podStartSLOduration=2.453324551 podStartE2EDuration="4.98800993s" podCreationTimestamp="2026-02-01 07:02:12 +0000 UTC" firstStartedPulling="2026-02-01 07:02:13.31002766 +0000 UTC m=+883.795930023" lastFinishedPulling="2026-02-01 07:02:15.844713049 +0000 UTC m=+886.330615402" observedRunningTime="2026-02-01 07:02:16.23963843 +0000 UTC m=+886.725540803" watchObservedRunningTime="2026-02-01 07:02:16.98800993 +0000 UTC m=+887.473912293" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.047947 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content\") pod \"df8546f7-eca8-4487-b502-5293a676dc22\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.048097 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dljsh\" (UniqueName: \"kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh\") pod \"df8546f7-eca8-4487-b502-5293a676dc22\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.050744 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities\") pod \"df8546f7-eca8-4487-b502-5293a676dc22\" (UID: \"df8546f7-eca8-4487-b502-5293a676dc22\") " Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.051816 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities" (OuterVolumeSpecName: "utilities") pod "df8546f7-eca8-4487-b502-5293a676dc22" (UID: "df8546f7-eca8-4487-b502-5293a676dc22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.057742 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh" (OuterVolumeSpecName: "kube-api-access-dljsh") pod "df8546f7-eca8-4487-b502-5293a676dc22" (UID: "df8546f7-eca8-4487-b502-5293a676dc22"). InnerVolumeSpecName "kube-api-access-dljsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.152081 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dljsh\" (UniqueName: \"kubernetes.io/projected/df8546f7-eca8-4487-b502-5293a676dc22-kube-api-access-dljsh\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.152110 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.168728 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df8546f7-eca8-4487-b502-5293a676dc22" (UID: "df8546f7-eca8-4487-b502-5293a676dc22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.194229 5127 generic.go:334] "Generic (PLEG): container finished" podID="df8546f7-eca8-4487-b502-5293a676dc22" containerID="951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f" exitCode=0 Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.194296 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grn6s" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.194296 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerDied","Data":"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f"} Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.194361 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grn6s" event={"ID":"df8546f7-eca8-4487-b502-5293a676dc22","Type":"ContainerDied","Data":"236a993166fba02bda074d1c1004442b2b68eddc085c294250f2bdec20ace03e"} Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.194384 5127 scope.go:117] "RemoveContainer" containerID="951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.212006 5127 scope.go:117] "RemoveContainer" containerID="7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.221874 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.225943 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grn6s"] Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.234802 5127 scope.go:117] "RemoveContainer" containerID="8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.253288 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df8546f7-eca8-4487-b502-5293a676dc22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.257274 5127 scope.go:117] "RemoveContainer" containerID="951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f" Feb 01 07:02:17 crc kubenswrapper[5127]: E0201 07:02:17.257845 5127 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f\": container with ID starting with 951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f not found: ID does not exist" containerID="951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.257889 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f"} err="failed to get container status \"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f\": rpc error: code = NotFound desc = could not find container \"951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f\": container with ID starting with 951beb9678a8c54f97b8c077a272738d68b744f857915c69e8aec981b004f99f not found: ID does not exist" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.257921 5127 scope.go:117] "RemoveContainer" containerID="7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1" Feb 01 07:02:17 crc kubenswrapper[5127]: E0201 07:02:17.258287 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1\": container with ID starting with 7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1 not found: ID does not exist" containerID="7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.258324 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1"} err="failed to get container status \"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1\": rpc error: code = NotFound desc = could not find container \"7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1\": container with ID starting with 7c0e0e4757ed34f7d1c1d1d48824daad9a658ee2b7521de3eb79ef2af03858d1 not found: ID does not exist" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.258347 5127 scope.go:117] "RemoveContainer" containerID="8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7" Feb 01 07:02:17 crc kubenswrapper[5127]: E0201 07:02:17.258669 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7\": container with ID starting with 8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7 not found: ID does not exist" containerID="8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7" Feb 01 07:02:17 crc kubenswrapper[5127]: I0201 07:02:17.258696 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7"} err="failed to get container status \"8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7\": rpc error: code = NotFound desc = could not find container \"8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7\": container with ID starting with 8bab86f601419f75f00af5274c9fec039c86662ec5c1a7ee0c3093a4c79cf0a7 not found: ID does not exist" Feb 01 07:02:18 crc kubenswrapper[5127]: I0201 07:02:18.204649 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" event={"ID":"1f8bacee-d920-4264-bfc2-249be9f4c352","Type":"ContainerStarted","Data":"a4e971a3d22e66d030248a675db0c1237f1c77f08f8bcb4753bda08522a9a2ee"} Feb 01 07:02:18 crc kubenswrapper[5127]: I0201 07:02:18.230490 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-zlwp8" podStartSLOduration=2.7143850240000003 podStartE2EDuration="7.230325045s" podCreationTimestamp="2026-02-01 07:02:11 +0000 UTC" firstStartedPulling="2026-02-01 07:02:12.530809533 +0000 UTC m=+883.016711896" lastFinishedPulling="2026-02-01 07:02:17.046749554 +0000 UTC m=+887.532651917" observedRunningTime="2026-02-01 07:02:18.225839405 +0000 UTC m=+888.711741768" watchObservedRunningTime="2026-02-01 07:02:18.230325045 +0000 UTC m=+888.716227418" Feb 01 07:02:18 crc kubenswrapper[5127]: I0201 07:02:18.255370 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df8546f7-eca8-4487-b502-5293a676dc22" path="/var/lib/kubelet/pods/df8546f7-eca8-4487-b502-5293a676dc22/volumes" Feb 01 07:02:19 crc kubenswrapper[5127]: I0201 07:02:19.897383 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:19 crc kubenswrapper[5127]: I0201 07:02:19.897439 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:19 crc kubenswrapper[5127]: I0201 07:02:19.963465 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:20 crc kubenswrapper[5127]: I0201 07:02:20.268545 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:21 crc kubenswrapper[5127]: I0201 07:02:21.357248 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.239060 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g72lc" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="registry-server" containerID="cri-o://97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4" gracePeriod=2 Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.395000 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qmpft" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.624147 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.633362 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6vqm\" (UniqueName: \"kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm\") pod \"c51198e1-3aba-4c33-ba42-f72ee8971882\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.633456 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content\") pod \"c51198e1-3aba-4c33-ba42-f72ee8971882\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.633509 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities\") pod \"c51198e1-3aba-4c33-ba42-f72ee8971882\" (UID: \"c51198e1-3aba-4c33-ba42-f72ee8971882\") " Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.634345 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities" (OuterVolumeSpecName: "utilities") pod "c51198e1-3aba-4c33-ba42-f72ee8971882" (UID: "c51198e1-3aba-4c33-ba42-f72ee8971882"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.638381 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm" (OuterVolumeSpecName: "kube-api-access-f6vqm") pod "c51198e1-3aba-4c33-ba42-f72ee8971882" (UID: "c51198e1-3aba-4c33-ba42-f72ee8971882"). InnerVolumeSpecName "kube-api-access-f6vqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.654026 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.654110 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.659062 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.660374 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c51198e1-3aba-4c33-ba42-f72ee8971882" (UID: "c51198e1-3aba-4c33-ba42-f72ee8971882"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.736263 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6vqm\" (UniqueName: \"kubernetes.io/projected/c51198e1-3aba-4c33-ba42-f72ee8971882-kube-api-access-f6vqm\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.736304 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:22 crc kubenswrapper[5127]: I0201 07:02:22.736313 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c51198e1-3aba-4c33-ba42-f72ee8971882-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.250316 5127 generic.go:334] "Generic (PLEG): container finished" podID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerID="97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4" exitCode=0 Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.250391 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g72lc" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.250482 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerDied","Data":"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4"} Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.250522 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g72lc" event={"ID":"c51198e1-3aba-4c33-ba42-f72ee8971882","Type":"ContainerDied","Data":"a08ee632d8b853aa916dfabbd0c603c4dacb607fa93d5cde8b7424e28ae01790"} Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.250559 5127 scope.go:117] "RemoveContainer" containerID="97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.258976 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c6cc56dfb-j944f" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.278839 5127 scope.go:117] "RemoveContainer" containerID="e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.313719 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.321743 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g72lc"] Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.335677 5127 scope.go:117] "RemoveContainer" containerID="9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.352139 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"] Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.376137 5127 scope.go:117] "RemoveContainer" containerID="97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4" Feb 01 07:02:23 crc kubenswrapper[5127]: E0201 07:02:23.376982 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4\": container with ID starting with 97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4 not found: ID does not exist" containerID="97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.377141 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4"} err="failed to get container status \"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4\": rpc error: code = NotFound desc = could not find container \"97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4\": container with ID starting with 97a01bc68a0b73b8038627ffecb25f6c309ab3d950eb46ba2636773e4b9e2fb4 not found: ID does not exist" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.377262 5127 scope.go:117] "RemoveContainer" containerID="e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc" Feb 01 07:02:23 crc kubenswrapper[5127]: E0201 07:02:23.377916 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc\": container with ID starting with e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc not found: ID does not exist" containerID="e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.377956 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc"} err="failed to get container status \"e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc\": rpc error: code = NotFound desc = could not find container \"e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc\": container with ID starting with e88eabe839d3cc21c6ba02135a8ed4b4dc5d80a9c72f31295ac5fffa64e07edc not found: ID does not exist" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.377984 5127 scope.go:117] "RemoveContainer" containerID="9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51" Feb 01 07:02:23 crc kubenswrapper[5127]: E0201 07:02:23.378437 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51\": container with ID starting with 9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51 not found: ID does not exist" containerID="9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51" Feb 01 07:02:23 crc kubenswrapper[5127]: I0201 07:02:23.378489 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51"} err="failed to get container status \"9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51\": rpc error: code = NotFound desc = could not find container \"9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51\": container with ID starting with 9b05c182fec52a5b7c9b6d6295214d8f2d5ba02fecfb1a10851ad80a0b1b1b51 not found: ID does not exist" Feb 01 07:02:24 crc kubenswrapper[5127]: I0201 07:02:24.248444 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" 
path="/var/lib/kubelet/pods/c51198e1-3aba-4c33-ba42-f72ee8971882/volumes" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509007 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509748 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="extract-utilities" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509770 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="extract-utilities" Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509788 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="extract-utilities" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509799 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="extract-utilities" Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509821 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509833 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509857 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509869 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509889 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="extract-content" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509900 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="extract-content" Feb 01 07:02:27 crc kubenswrapper[5127]: E0201 07:02:27.509921 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="extract-content" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.509932 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="extract-content" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.510128 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c51198e1-3aba-4c33-ba42-f72ee8971882" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.510150 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="df8546f7-eca8-4487-b502-5293a676dc22" containerName="registry-server" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.511427 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.519394 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.609549 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbqm2\" (UniqueName: \"kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.609616 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.609661 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.710516 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.710609 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.710717 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbqm2\" (UniqueName: \"kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.711126 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.711191 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.733763 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vbqm2\" (UniqueName: \"kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2\") pod \"community-operators-n4sv7\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:27 crc kubenswrapper[5127]: I0201 07:02:27.835727 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:28 crc kubenswrapper[5127]: I0201 07:02:28.092006 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:28 crc kubenswrapper[5127]: I0201 07:02:28.299410 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerStarted","Data":"d1ecdae449740dbb7fabf06d069378314abc76666968c87a84f67dcfcb99b4a4"} Feb 01 07:02:29 crc kubenswrapper[5127]: I0201 07:02:29.308427 5127 generic.go:334] "Generic (PLEG): container finished" podID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerID="1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf" exitCode=0 Feb 01 07:02:29 crc kubenswrapper[5127]: I0201 07:02:29.308536 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerDied","Data":"1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf"} Feb 01 07:02:30 crc kubenswrapper[5127]: I0201 07:02:30.318952 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerStarted","Data":"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3"} Feb 01 07:02:30 crc kubenswrapper[5127]: I0201 07:02:30.964606 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:30 crc kubenswrapper[5127]: I0201 07:02:30.965927 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:30 crc kubenswrapper[5127]: I0201 07:02:30.990238 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.051681 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.051752 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsq7k\" (UniqueName: \"kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.051773 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.152936 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.153036 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsq7k\" (UniqueName: \"kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.153145 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.153496 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.153501 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.173520 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lsq7k\" (UniqueName: \"kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k\") pod \"certified-operators-k8jch\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.297494 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.329753 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerDied","Data":"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3"} Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.329325 5127 generic.go:334] "Generic (PLEG): container finished" podID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerID="604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3" exitCode=0 Feb 01 07:02:31 crc kubenswrapper[5127]: I0201 07:02:31.740926 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:31 crc kubenswrapper[5127]: W0201 07:02:31.744404 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd79d87_6af0_4df7_86c9_2747a307a48d.slice/crio-a74324edbbcaf9ce1a44c7ed6cf7b9ee1843eb6cb87b23609555e7ab0019b2be WatchSource:0}: Error finding container a74324edbbcaf9ce1a44c7ed6cf7b9ee1843eb6cb87b23609555e7ab0019b2be: Status 404 returned error can't find the container with id a74324edbbcaf9ce1a44c7ed6cf7b9ee1843eb6cb87b23609555e7ab0019b2be Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.338365 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerStarted","Data":"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83"} Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.339942 5127 generic.go:334] "Generic (PLEG): container finished" podID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerID="02043fde3936f75ad889b8d8bc3e06293ffc179b55ef66b9f3c2404cc293fec1" exitCode=0 Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.339984 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerDied","Data":"02043fde3936f75ad889b8d8bc3e06293ffc179b55ef66b9f3c2404cc293fec1"} Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.340004 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerStarted","Data":"a74324edbbcaf9ce1a44c7ed6cf7b9ee1843eb6cb87b23609555e7ab0019b2be"} Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.360688 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4sv7" podStartSLOduration=2.854068476 podStartE2EDuration="5.360670842s" podCreationTimestamp="2026-02-01 07:02:27 +0000 UTC" firstStartedPulling="2026-02-01 07:02:29.314969422 +0000 UTC m=+899.800871815" lastFinishedPulling="2026-02-01 07:02:31.821571818 +0000 UTC m=+902.307474181" observedRunningTime="2026-02-01 07:02:32.358215356 +0000 UTC 
m=+902.844117779" watchObservedRunningTime="2026-02-01 07:02:32.360670842 +0000 UTC m=+902.846573205" Feb 01 07:02:32 crc kubenswrapper[5127]: I0201 07:02:32.935884 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-zpcmk" Feb 01 07:02:33 crc kubenswrapper[5127]: I0201 07:02:33.348451 5127 generic.go:334] "Generic (PLEG): container finished" podID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerID="c278a3a5ad6c069dfd81bf852a5442e2c830b6c412ac2e13c49abd7427b9d87d" exitCode=0 Feb 01 07:02:33 crc kubenswrapper[5127]: I0201 07:02:33.348517 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerDied","Data":"c278a3a5ad6c069dfd81bf852a5442e2c830b6c412ac2e13c49abd7427b9d87d"} Feb 01 07:02:34 crc kubenswrapper[5127]: I0201 07:02:34.356886 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerStarted","Data":"340f29a6e98d908530127e31f0be98f6259110b05c182e9908abb9a103f0d78f"} Feb 01 07:02:34 crc kubenswrapper[5127]: I0201 07:02:34.380653 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k8jch" podStartSLOduration=2.957706968 podStartE2EDuration="4.380634791s" podCreationTimestamp="2026-02-01 07:02:30 +0000 UTC" firstStartedPulling="2026-02-01 07:02:32.341103308 +0000 UTC m=+902.827005671" lastFinishedPulling="2026-02-01 07:02:33.764031111 +0000 UTC m=+904.249933494" observedRunningTime="2026-02-01 07:02:34.379341456 +0000 UTC m=+904.865243809" watchObservedRunningTime="2026-02-01 07:02:34.380634791 +0000 UTC m=+904.866537154" Feb 01 07:02:37 crc kubenswrapper[5127]: I0201 07:02:37.836419 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:37 crc kubenswrapper[5127]: I0201 07:02:37.836808 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:37 crc kubenswrapper[5127]: I0201 07:02:37.875327 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:38 crc kubenswrapper[5127]: I0201 07:02:38.443173 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:38 crc kubenswrapper[5127]: I0201 07:02:38.500086 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.396103 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4sv7" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="registry-server" containerID="cri-o://042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83" gracePeriod=2 Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.823416 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.983930 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities\") pod \"92a65ed2-e3e0-451c-b629-0e2f684d367a\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.984028 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content\") pod \"92a65ed2-e3e0-451c-b629-0e2f684d367a\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.984107 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbqm2\" (UniqueName: \"kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2\") pod \"92a65ed2-e3e0-451c-b629-0e2f684d367a\" (UID: \"92a65ed2-e3e0-451c-b629-0e2f684d367a\") " Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.984993 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities" (OuterVolumeSpecName: "utilities") pod "92a65ed2-e3e0-451c-b629-0e2f684d367a" (UID: "92a65ed2-e3e0-451c-b629-0e2f684d367a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:40 crc kubenswrapper[5127]: I0201 07:02:40.991815 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2" (OuterVolumeSpecName: "kube-api-access-vbqm2") pod "92a65ed2-e3e0-451c-b629-0e2f684d367a" (UID: "92a65ed2-e3e0-451c-b629-0e2f684d367a"). InnerVolumeSpecName "kube-api-access-vbqm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.086270 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbqm2\" (UniqueName: \"kubernetes.io/projected/92a65ed2-e3e0-451c-b629-0e2f684d367a-kube-api-access-vbqm2\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.086329 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.172953 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92a65ed2-e3e0-451c-b629-0e2f684d367a" (UID: "92a65ed2-e3e0-451c-b629-0e2f684d367a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.187482 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92a65ed2-e3e0-451c-b629-0e2f684d367a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.298260 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.298320 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.353005 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.402841 5127 generic.go:334] "Generic (PLEG): container finished" podID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerID="042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83" exitCode=0 Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.403052 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerDied","Data":"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83"} Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.403104 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4sv7" event={"ID":"92a65ed2-e3e0-451c-b629-0e2f684d367a","Type":"ContainerDied","Data":"d1ecdae449740dbb7fabf06d069378314abc76666968c87a84f67dcfcb99b4a4"} Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.403124 5127 scope.go:117] "RemoveContainer" containerID="042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.403257 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4sv7" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.418988 5127 scope.go:117] "RemoveContainer" containerID="604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.440465 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.443225 5127 scope.go:117] "RemoveContainer" containerID="1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.443878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.445041 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4sv7"] Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.476566 5127 scope.go:117] "RemoveContainer" containerID="042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83" Feb 01 07:02:41 crc kubenswrapper[5127]: E0201 07:02:41.477380 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83\": container with ID starting with 042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83 not found: ID does not exist" containerID="042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.477424 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83"} err="failed to get container status \"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83\": rpc error: code = NotFound desc = could not find container \"042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83\": container with ID starting with 042293530cecf60f690ba6dc1d27b8548f2dbad4001f82740ccc99e10e9ffb83 not found: ID does not exist" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.477455 5127 scope.go:117] "RemoveContainer" containerID="604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3" Feb 01 07:02:41 crc kubenswrapper[5127]: E0201 07:02:41.477984 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3\": container with ID starting with 604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3 not found: ID does not exist" containerID="604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.478013 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3"} err="failed to get container status \"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3\": rpc error: code = NotFound desc = could not find container \"604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3\": container with ID starting with 604624c71a281b1bdd3c044c0ba4a7c5d5fd95d5b1a8be71d1042f70387c8fb3 not found: ID does not exist" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.478036 5127 scope.go:117] "RemoveContainer" 
containerID="1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf" Feb 01 07:02:41 crc kubenswrapper[5127]: E0201 07:02:41.478238 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf\": container with ID starting with 1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf not found: ID does not exist" containerID="1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf" Feb 01 07:02:41 crc kubenswrapper[5127]: I0201 07:02:41.478259 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf"} err="failed to get container status \"1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf\": rpc error: code = NotFound desc = could not find container \"1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf\": container with ID starting with 1eb989e9e049892134f6a842fbfcb05c648fc07fb200d50ddcc5cf12da9670bf not found: ID does not exist" Feb 01 07:02:42 crc kubenswrapper[5127]: I0201 07:02:42.244470 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" path="/var/lib/kubelet/pods/92a65ed2-e3e0-451c-b629-0e2f684d367a/volumes" Feb 01 07:02:43 crc kubenswrapper[5127]: I0201 07:02:43.758142 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:43 crc kubenswrapper[5127]: I0201 07:02:43.758876 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k8jch" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="registry-server" containerID="cri-o://340f29a6e98d908530127e31f0be98f6259110b05c182e9908abb9a103f0d78f" gracePeriod=2 Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.428622 5127 generic.go:334] "Generic (PLEG): container finished" podID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerID="340f29a6e98d908530127e31f0be98f6259110b05c182e9908abb9a103f0d78f" exitCode=0 Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.428711 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerDied","Data":"340f29a6e98d908530127e31f0be98f6259110b05c182e9908abb9a103f0d78f"} Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.639266 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.832355 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities\") pod \"2cd79d87-6af0-4df7-86c9-2747a307a48d\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.832421 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content\") pod \"2cd79d87-6af0-4df7-86c9-2747a307a48d\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.832542 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsq7k\" (UniqueName: \"kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k\") pod \"2cd79d87-6af0-4df7-86c9-2747a307a48d\" (UID: \"2cd79d87-6af0-4df7-86c9-2747a307a48d\") " Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.833238 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities" (OuterVolumeSpecName: "utilities") pod "2cd79d87-6af0-4df7-86c9-2747a307a48d" (UID: "2cd79d87-6af0-4df7-86c9-2747a307a48d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.843837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k" (OuterVolumeSpecName: "kube-api-access-lsq7k") pod "2cd79d87-6af0-4df7-86c9-2747a307a48d" (UID: "2cd79d87-6af0-4df7-86c9-2747a307a48d"). InnerVolumeSpecName "kube-api-access-lsq7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.874012 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cd79d87-6af0-4df7-86c9-2747a307a48d" (UID: "2cd79d87-6af0-4df7-86c9-2747a307a48d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.934626 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsq7k\" (UniqueName: \"kubernetes.io/projected/2cd79d87-6af0-4df7-86c9-2747a307a48d-kube-api-access-lsq7k\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.934675 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:44 crc kubenswrapper[5127]: I0201 07:02:44.934685 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd79d87-6af0-4df7-86c9-2747a307a48d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.440822 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k8jch" event={"ID":"2cd79d87-6af0-4df7-86c9-2747a307a48d","Type":"ContainerDied","Data":"a74324edbbcaf9ce1a44c7ed6cf7b9ee1843eb6cb87b23609555e7ab0019b2be"} Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.441077 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k8jch" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.442385 5127 scope.go:117] "RemoveContainer" containerID="340f29a6e98d908530127e31f0be98f6259110b05c182e9908abb9a103f0d78f" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.486612 5127 scope.go:117] "RemoveContainer" containerID="c278a3a5ad6c069dfd81bf852a5442e2c830b6c412ac2e13c49abd7427b9d87d" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.509052 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.512167 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k8jch"] Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.513840 5127 scope.go:117] "RemoveContainer" containerID="02043fde3936f75ad889b8d8bc3e06293ffc179b55ef66b9f3c2404cc293fec1" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820500 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52"] Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820807 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="extract-content" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820821 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="extract-content" Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820830 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820837 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820843 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="extract-utilities" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820851 5127 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="extract-utilities" Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820872 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="extract-content" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820879 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="extract-content" Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820888 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820896 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: E0201 07:02:45.820905 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="extract-utilities" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.820913 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="extract-utilities" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.821012 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.821023 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a65ed2-e3e0-451c-b629-0e2f684d367a" containerName="registry-server" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.821768 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.824642 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.844405 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52"] Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.946263 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.946729 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:45 crc kubenswrapper[5127]: I0201 07:02:45.946848 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48z9q\" (UniqueName: \"kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.047992 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.048064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48z9q\" (UniqueName: \"kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.048219 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.049035 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.049272 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.068930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48z9q\" (UniqueName: \"kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.139398 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.248473 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd79d87-6af0-4df7-86c9-2747a307a48d" path="/var/lib/kubelet/pods/2cd79d87-6af0-4df7-86c9-2747a307a48d/volumes" Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.365636 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52"] Feb 01 07:02:46 crc kubenswrapper[5127]: I0201 07:02:46.448540 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" event={"ID":"e86fe952-2a5c-4a01-b82e-53ba47fc92c8","Type":"ContainerStarted","Data":"4dfbbd2e31730f7f63f1199cf706a7b6ae3df6b5fbf661bad040e17c67bde659"} Feb 01 07:02:47 crc kubenswrapper[5127]: I0201 07:02:47.469446 5127 generic.go:334] "Generic (PLEG): container finished" podID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerID="fdd3235fae8a483bdd059c51c9cbd66c30c716995689ad1462387ad0803c7bf7" exitCode=0 Feb 01 07:02:47 crc kubenswrapper[5127]: I0201 07:02:47.469562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" event={"ID":"e86fe952-2a5c-4a01-b82e-53ba47fc92c8","Type":"ContainerDied","Data":"fdd3235fae8a483bdd059c51c9cbd66c30c716995689ad1462387ad0803c7bf7"} Feb 01 07:02:48 crc kubenswrapper[5127]: I0201 07:02:48.412239 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-zfpgn" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerName="console" containerID="cri-o://c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7" gracePeriod=15 Feb 01 07:02:48 crc kubenswrapper[5127]: I0201 07:02:48.844309 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zfpgn_99bc3500-5fb6-4d26-97dd-24dc06658294/console/0.log" Feb 01 07:02:48 crc kubenswrapper[5127]: I0201 07:02:48.844374 5127 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.015499 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.015976 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016013 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qzhv\" (UniqueName: \"kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016047 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016152 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016213 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016253 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle\") pod \"99bc3500-5fb6-4d26-97dd-24dc06658294\" (UID: \"99bc3500-5fb6-4d26-97dd-24dc06658294\") " Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016566 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016778 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config" (OuterVolumeSpecName: "console-config") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.016791 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.017311 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca" (OuterVolumeSpecName: "service-ca") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.023375 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.024194 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.025429 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv" (OuterVolumeSpecName: "kube-api-access-6qzhv") pod "99bc3500-5fb6-4d26-97dd-24dc06658294" (UID: "99bc3500-5fb6-4d26-97dd-24dc06658294"). InnerVolumeSpecName "kube-api-access-6qzhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117400 5127 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-console-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117440 5127 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117449 5127 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117457 5127 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/99bc3500-5fb6-4d26-97dd-24dc06658294-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117467 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qzhv\" (UniqueName: \"kubernetes.io/projected/99bc3500-5fb6-4d26-97dd-24dc06658294-kube-api-access-6qzhv\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117476 5127 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.117484 5127 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/99bc3500-5fb6-4d26-97dd-24dc06658294-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.483664 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-zfpgn_99bc3500-5fb6-4d26-97dd-24dc06658294/console/0.log" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.483751 5127 generic.go:334] "Generic (PLEG): container finished" podID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerID="c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7" exitCode=2 Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.483873 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-zfpgn" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.483879 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfpgn" event={"ID":"99bc3500-5fb6-4d26-97dd-24dc06658294","Type":"ContainerDied","Data":"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7"} Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.484030 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-zfpgn" event={"ID":"99bc3500-5fb6-4d26-97dd-24dc06658294","Type":"ContainerDied","Data":"7d61dcbe3da3e2a463fdd846be5ce3880fc51e835b205f6c3f58c65bfadeb8fc"} Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.484060 5127 scope.go:117] "RemoveContainer" containerID="c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.486533 5127 generic.go:334] "Generic (PLEG): container finished" podID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerID="5f9012b50af7de2d34c28d9050ef0d2eeb6a33d4ab1aad0f3e3a6fc909680db7" exitCode=0 Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.486602 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" event={"ID":"e86fe952-2a5c-4a01-b82e-53ba47fc92c8","Type":"ContainerDied","Data":"5f9012b50af7de2d34c28d9050ef0d2eeb6a33d4ab1aad0f3e3a6fc909680db7"} Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.515037 5127 scope.go:117] "RemoveContainer" containerID="c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7" Feb 01 07:02:49 crc kubenswrapper[5127]: E0201 07:02:49.515525 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7\": container with ID starting with c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7 not found: ID does not exist" containerID="c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.515564 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7"} err="failed to get container status \"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7\": rpc error: code = NotFound desc = could not find container \"c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7\": container with ID starting with c65c61d253ae47b7c2e9be161383272d29f272dfed5b9bf970d4335d5f7971f7 not found: ID does not exist" Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.535532 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"] Feb 01 07:02:49 crc kubenswrapper[5127]: I0201 07:02:49.542645 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-zfpgn"] Feb 01 07:02:50 crc kubenswrapper[5127]: I0201 07:02:50.243507 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" path="/var/lib/kubelet/pods/99bc3500-5fb6-4d26-97dd-24dc06658294/volumes" Feb 01 07:02:50 crc kubenswrapper[5127]: I0201 07:02:50.498022 5127 generic.go:334] "Generic (PLEG): container finished" podID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" 
containerID="81f0a6b52e5846656b13bda443f6585f76f401569bdf18803a23a0938afb563a" exitCode=0 Feb 01 07:02:50 crc kubenswrapper[5127]: I0201 07:02:50.498089 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" event={"ID":"e86fe952-2a5c-4a01-b82e-53ba47fc92c8","Type":"ContainerDied","Data":"81f0a6b52e5846656b13bda443f6585f76f401569bdf18803a23a0938afb563a"} Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.737980 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.759909 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util\") pod \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.759971 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48z9q\" (UniqueName: \"kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q\") pod \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.760026 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle\") pod \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\" (UID: \"e86fe952-2a5c-4a01-b82e-53ba47fc92c8\") " Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.761534 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle" (OuterVolumeSpecName: "bundle") pod "e86fe952-2a5c-4a01-b82e-53ba47fc92c8" (UID: "e86fe952-2a5c-4a01-b82e-53ba47fc92c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.768497 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q" (OuterVolumeSpecName: "kube-api-access-48z9q") pod "e86fe952-2a5c-4a01-b82e-53ba47fc92c8" (UID: "e86fe952-2a5c-4a01-b82e-53ba47fc92c8"). InnerVolumeSpecName "kube-api-access-48z9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.866759 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48z9q\" (UniqueName: \"kubernetes.io/projected/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-kube-api-access-48z9q\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.866865 5127 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.920050 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util" (OuterVolumeSpecName: "util") pod "e86fe952-2a5c-4a01-b82e-53ba47fc92c8" (UID: "e86fe952-2a5c-4a01-b82e-53ba47fc92c8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:02:51 crc kubenswrapper[5127]: I0201 07:02:51.968612 5127 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e86fe952-2a5c-4a01-b82e-53ba47fc92c8-util\") on node \"crc\" DevicePath \"\"" Feb 01 07:02:52 crc kubenswrapper[5127]: I0201 07:02:52.512625 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" event={"ID":"e86fe952-2a5c-4a01-b82e-53ba47fc92c8","Type":"ContainerDied","Data":"4dfbbd2e31730f7f63f1199cf706a7b6ae3df6b5fbf661bad040e17c67bde659"} Feb 01 07:02:52 crc kubenswrapper[5127]: I0201 07:02:52.512665 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dfbbd2e31730f7f63f1199cf706a7b6ae3df6b5fbf661bad040e17c67bde659" Feb 01 07:02:52 crc kubenswrapper[5127]: I0201 07:02:52.512719 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.868207 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-65f7457996-dlrps"] Feb 01 07:03:00 crc kubenswrapper[5127]: E0201 07:03:00.869040 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="pull" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869056 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="pull" Feb 01 07:03:00 crc kubenswrapper[5127]: E0201 07:03:00.869071 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="extract" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869079 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="extract" Feb 01 07:03:00 crc kubenswrapper[5127]: E0201 07:03:00.869106 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="util" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869116 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="util" Feb 01 07:03:00 crc kubenswrapper[5127]: E0201 07:03:00.869126 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerName="console" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869133 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerName="console" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869246 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86fe952-2a5c-4a01-b82e-53ba47fc92c8" containerName="extract" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869261 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bc3500-5fb6-4d26-97dd-24dc06658294" containerName="console" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.869796 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.872152 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.872384 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.872496 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.872535 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.872516 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9nn9s" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.883321 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65f7457996-dlrps"] Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.970973 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-apiservice-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.971037 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcq2l\" (UniqueName: \"kubernetes.io/projected/6559471d-0983-456e-9890-5997a2923dd8-kube-api-access-gcq2l\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:00 crc kubenswrapper[5127]: I0201 07:03:00.971059 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-webhook-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.072869 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-apiservice-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.072968 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcq2l\" (UniqueName: \"kubernetes.io/projected/6559471d-0983-456e-9890-5997a2923dd8-kube-api-access-gcq2l\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.073002 
5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-webhook-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.079195 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-apiservice-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.081293 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6559471d-0983-456e-9890-5997a2923dd8-webhook-cert\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.094095 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcq2l\" (UniqueName: \"kubernetes.io/projected/6559471d-0983-456e-9890-5997a2923dd8-kube-api-access-gcq2l\") pod \"metallb-operator-controller-manager-65f7457996-dlrps\" (UID: \"6559471d-0983-456e-9890-5997a2923dd8\") " pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.098016 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd"] Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.098819 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.101904 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.102310 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.102420 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2dtbv" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.118906 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd"] Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.174299 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-apiservice-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.174427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-webhook-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.174475 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvgvs\" (UniqueName: \"kubernetes.io/projected/3f1c209c-f9db-4d28-a248-dfd64c611455-kube-api-access-lvgvs\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.188101 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.277373 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-webhook-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.277636 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvgvs\" (UniqueName: \"kubernetes.io/projected/3f1c209c-f9db-4d28-a248-dfd64c611455-kube-api-access-lvgvs\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.277655 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-apiservice-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.281551 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-webhook-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.290549 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f1c209c-f9db-4d28-a248-dfd64c611455-apiservice-cert\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.308319 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvgvs\" (UniqueName: \"kubernetes.io/projected/3f1c209c-f9db-4d28-a248-dfd64c611455-kube-api-access-lvgvs\") pod \"metallb-operator-webhook-server-74f77b7fdf-vs6dd\" (UID: \"3f1c209c-f9db-4d28-a248-dfd64c611455\") " pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.444648 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.445688 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65f7457996-dlrps"] Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.565019 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" event={"ID":"6559471d-0983-456e-9890-5997a2923dd8","Type":"ContainerStarted","Data":"8ccb137e01eab71f55b400a03251aeb4db9db5342a55a79d4d4d5fc365b243c1"} Feb 01 07:03:01 crc kubenswrapper[5127]: I0201 07:03:01.862510 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd"] Feb 01 07:03:01 crc kubenswrapper[5127]: W0201 07:03:01.884759 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f1c209c_f9db_4d28_a248_dfd64c611455.slice/crio-34cdc6fb7d9296e5119a55b68462f8a376d9159a1212447211c2dbafdd87eb7a WatchSource:0}: Error finding container 34cdc6fb7d9296e5119a55b68462f8a376d9159a1212447211c2dbafdd87eb7a: Status 404 returned error can't find the container with id 34cdc6fb7d9296e5119a55b68462f8a376d9159a1212447211c2dbafdd87eb7a Feb 01 07:03:02 crc kubenswrapper[5127]: I0201 07:03:02.575116 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" event={"ID":"3f1c209c-f9db-4d28-a248-dfd64c611455","Type":"ContainerStarted","Data":"34cdc6fb7d9296e5119a55b68462f8a376d9159a1212447211c2dbafdd87eb7a"} Feb 01 07:03:04 crc kubenswrapper[5127]: I0201 07:03:04.592733 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" event={"ID":"6559471d-0983-456e-9890-5997a2923dd8","Type":"ContainerStarted","Data":"bc22c898e1fc39eb5715e702211fbac95e4d7274b1611d9851bdc7a40be233b9"} Feb 01 07:03:04 crc kubenswrapper[5127]: I0201 07:03:04.593318 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:04 crc kubenswrapper[5127]: I0201 07:03:04.628946 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" podStartSLOduration=1.869893775 podStartE2EDuration="4.628926237s" podCreationTimestamp="2026-02-01 07:03:00 +0000 UTC" firstStartedPulling="2026-02-01 07:03:01.468404173 +0000 UTC m=+931.954306536" lastFinishedPulling="2026-02-01 07:03:04.227436635 +0000 UTC m=+934.713338998" observedRunningTime="2026-02-01 07:03:04.623939722 +0000 UTC m=+935.109842085" watchObservedRunningTime="2026-02-01 07:03:04.628926237 +0000 UTC m=+935.114828610" Feb 01 07:03:06 crc kubenswrapper[5127]: I0201 07:03:06.607776 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" event={"ID":"3f1c209c-f9db-4d28-a248-dfd64c611455","Type":"ContainerStarted","Data":"48e30ccbd7c18b9b9dd945a3613baa7d08ff6b768c0df5c5fc48c04fd8c38e62"} Feb 01 07:03:06 crc kubenswrapper[5127]: I0201 07:03:06.608136 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:06 crc kubenswrapper[5127]: I0201 07:03:06.627993 5127 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" podStartSLOduration=1.719623696 podStartE2EDuration="5.62797933s" podCreationTimestamp="2026-02-01 07:03:01 +0000 UTC" firstStartedPulling="2026-02-01 07:03:01.887982965 +0000 UTC m=+932.373885328" lastFinishedPulling="2026-02-01 07:03:05.796338609 +0000 UTC m=+936.282240962" observedRunningTime="2026-02-01 07:03:06.626250782 +0000 UTC m=+937.112153155" watchObservedRunningTime="2026-02-01 07:03:06.62797933 +0000 UTC m=+937.113881693" Feb 01 07:03:21 crc kubenswrapper[5127]: I0201 07:03:21.451773 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74f77b7fdf-vs6dd" Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.192512 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-65f7457996-dlrps" Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.970161 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r"] Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.971620 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.975264 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-skqf9" Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.975527 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.981409 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vpnfg"] Feb 01 07:03:41 crc kubenswrapper[5127]: I0201 07:03:41.992675 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.001870 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.002094 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.003399 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.066066 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5gpqk"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.068383 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.075563 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.076378 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.076785 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ns6jb" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.076973 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.084228 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-wgvph"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.085709 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.087848 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.093417 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wgvph"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141506 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-reloader\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141559 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnjxl\" (UniqueName: \"kubernetes.io/projected/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-kube-api-access-jnjxl\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141754 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141774 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86z8s\" (UniqueName: \"kubernetes.io/projected/a116c9d4-3422-4263-83ab-dd00009d9603-kube-api-access-86z8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141840 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-sockets\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141866 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141885 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-conf\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141914 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-startup\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.141929 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a116c9d4-3422-4263-83ab-dd00009d9603-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243213 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-reloader\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243640 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-reloader\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243702 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnjxl\" (UniqueName: \"kubernetes.io/projected/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-kube-api-access-jnjxl\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243750 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-metrics-certs\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243771 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-cert\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243789 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist\") pod \"speaker-5gpqk\" (UID: 
\"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243810 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243832 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86z8s\" (UniqueName: \"kubernetes.io/projected/a116c9d4-3422-4263-83ab-dd00009d9603-kube-api-access-86z8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243865 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-sockets\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243888 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243908 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-conf\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243943 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-startup\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243963 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a116c9d4-3422-4263-83ab-dd00009d9603-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.243986 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kfpx\" (UniqueName: \"kubernetes.io/projected/516e16c5-a825-4ea6-a093-91b77dedc874-kube-api-access-5kfpx\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.244008 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/438513cf-4480-46c2-b82e-ca515e475e06-metallb-excludel2\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 
07:03:42.244031 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-metrics-certs\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.244057 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49g9\" (UniqueName: \"kubernetes.io/projected/438513cf-4480-46c2-b82e-ca515e475e06-kube-api-access-k49g9\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.244855 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.245230 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-sockets\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg"
Feb 01 07:03:42 crc kubenswrapper[5127]: E0201 07:03:42.245308 5127 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Feb 01 07:03:42 crc kubenswrapper[5127]: E0201 07:03:42.245355 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs podName:1db2b7d1-a38e-4b50-8b1e-a30a5d59608a nodeName:}" failed. No retries permitted until 2026-02-01 07:03:42.745338408 +0000 UTC m=+973.231240771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs") pod "frr-k8s-vpnfg" (UID: "1db2b7d1-a38e-4b50-8b1e-a30a5d59608a") : secret "frr-k8s-certs-secret" not found
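The "durationBeforeRetry 500ms" in the failed operation above is the first step of the volume manager's exponential backoff: each further failure of the same MountVolume.SetUp operation doubles the wait before the next attempt is permitted. A rough Go sketch of that pacing, assuming the upstream kubelet defaults of a 500ms initial delay and a 2m2s cap (only the 500ms value is visible in this log; the cap is an assumption):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// Assumed kubelet defaults; only the 500ms base appears in the log above.
const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second
)

// nextRetry doubles the wait after each consecutive failure, up to the cap.
func nextRetry(last time.Duration) time.Duration {
	if last == 0 {
		return initialDurationBeforeRetry
	}
	if next := 2 * last; next < maxDurationBeforeRetry {
		return next
	}
	return maxDurationBeforeRetry
}

func main() {
	// Stand-in for MountVolume.SetUp while the Secret has not been created yet.
	setUp := func() error { return errors.New(`secret "frr-k8s-certs-secret" not found`) }

	var wait time.Duration
	for attempt := 1; attempt <= 5; attempt++ {
		if err := setUp(); err != nil {
			wait = nextRetry(wait)
			fmt.Printf("attempt %d: %v; no retries permitted for %v\n", attempt, err, wait)
		}
	}
}
```

Here the mount only fails until the operator creates the frr-k8s-certs-secret Secret; the successful retry shows up below at 07:03:42.755187.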
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.245746 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-conf\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.246547 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-frr-startup\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.259798 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a116c9d4-3422-4263-83ab-dd00009d9603-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.266713 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnjxl\" (UniqueName: \"kubernetes.io/projected/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-kube-api-access-jnjxl\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.330598 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86z8s\" (UniqueName: \"kubernetes.io/projected/a116c9d4-3422-4263-83ab-dd00009d9603-kube-api-access-86z8s\") pod \"frr-k8s-webhook-server-7df86c4f6c-x4g6r\" (UID: \"a116c9d4-3422-4263-83ab-dd00009d9603\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.344943 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-metrics-certs\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.344997 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-cert\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345021 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk"
Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345100 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kfpx\" (UniqueName: \"kubernetes.io/projected/516e16c5-a825-4ea6-a093-91b77dedc874-kube-api-access-5kfpx\") pod \"controller-6968d8fdc4-wgvph\" (UID: 
\"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345127 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/438513cf-4480-46c2-b82e-ca515e475e06-metallb-excludel2\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345158 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-metrics-certs\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345184 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49g9\" (UniqueName: \"kubernetes.io/projected/438513cf-4480-46c2-b82e-ca515e475e06-kube-api-access-k49g9\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: E0201 07:03:42.345270 5127 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 01 07:03:42 crc kubenswrapper[5127]: E0201 07:03:42.345354 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist podName:438513cf-4480-46c2-b82e-ca515e475e06 nodeName:}" failed. No retries permitted until 2026-02-01 07:03:42.845331566 +0000 UTC m=+973.331233979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist") pod "speaker-5gpqk" (UID: "438513cf-4480-46c2-b82e-ca515e475e06") : secret "metallb-memberlist" not found Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.345976 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/438513cf-4480-46c2-b82e-ca515e475e06-metallb-excludel2\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.348635 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-metrics-certs\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.350111 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-metrics-certs\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.350407 5127 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.359904 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/516e16c5-a825-4ea6-a093-91b77dedc874-cert\") pod \"controller-6968d8fdc4-wgvph\" (UID: 
\"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.367257 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kfpx\" (UniqueName: \"kubernetes.io/projected/516e16c5-a825-4ea6-a093-91b77dedc874-kube-api-access-5kfpx\") pod \"controller-6968d8fdc4-wgvph\" (UID: \"516e16c5-a825-4ea6-a093-91b77dedc874\") " pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.386387 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49g9\" (UniqueName: \"kubernetes.io/projected/438513cf-4480-46c2-b82e-ca515e475e06-kube-api-access-k49g9\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.403084 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.589551 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-wgvph"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.609166 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.748866 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.755187 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1db2b7d1-a38e-4b50-8b1e-a30a5d59608a-metrics-certs\") pod \"frr-k8s-vpnfg\" (UID: \"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a\") " pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.841158 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r"] Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.847179 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wgvph" event={"ID":"516e16c5-a825-4ea6-a093-91b77dedc874","Type":"ContainerStarted","Data":"5067f21f130fc0980b7a2d30f3e63db6f924bdba655f24d8f1f8c6deb3d5d243"} Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.847463 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wgvph" event={"ID":"516e16c5-a825-4ea6-a093-91b77dedc874","Type":"ContainerStarted","Data":"8096590cae7bc1274bab0bbcf5b86e6014dbb52df54bf2a1d93f9b2c2e65eb0a"} Feb 01 07:03:42 crc kubenswrapper[5127]: W0201 07:03:42.847901 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda116c9d4_3422_4263_83ab_dd00009d9603.slice/crio-911efba8f1517be4984b7cf9f4119d4009e77c5c60adc334d88f539d32981834 WatchSource:0}: Error finding container 911efba8f1517be4984b7cf9f4119d4009e77c5c60adc334d88f539d32981834: Status 404 returned error can't find the container with id 911efba8f1517be4984b7cf9f4119d4009e77c5c60adc334d88f539d32981834 Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.849887 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.853157 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/438513cf-4480-46c2-b82e-ca515e475e06-memberlist\") pod \"speaker-5gpqk\" (UID: \"438513cf-4480-46c2-b82e-ca515e475e06\") " pod="metallb-system/speaker-5gpqk" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.925874 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:42 crc kubenswrapper[5127]: I0201 07:03:42.983999 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5gpqk" Feb 01 07:03:43 crc kubenswrapper[5127]: W0201 07:03:43.008468 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438513cf_4480_46c2_b82e_ca515e475e06.slice/crio-3291dfa09b148830be83aa22f91b01338c45d54546180c30e914aece01fc0270 WatchSource:0}: Error finding container 3291dfa09b148830be83aa22f91b01338c45d54546180c30e914aece01fc0270: Status 404 returned error can't find the container with id 3291dfa09b148830be83aa22f91b01338c45d54546180c30e914aece01fc0270 Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.855511 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"2d42517ddca35b9b9bcc84aea18fcca9ca8b5e616d4ed22f0556f86601845394"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.858037 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-wgvph" event={"ID":"516e16c5-a825-4ea6-a093-91b77dedc874","Type":"ContainerStarted","Data":"b5d5962ae7a14649eac637af74d02c24f63ed9ca34937a0e9667b3b0f0326eac"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.858131 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.859279 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5gpqk" event={"ID":"438513cf-4480-46c2-b82e-ca515e475e06","Type":"ContainerStarted","Data":"4edf9b90efdf17918bbbfadecc0c32c0bf10439978980297aa7f558212372609"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.859309 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5gpqk" event={"ID":"438513cf-4480-46c2-b82e-ca515e475e06","Type":"ContainerStarted","Data":"783ebcf263002d05295823823b6a9bfcb5a6579a72e1de59f4ff2ef7838d3d09"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.859320 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5gpqk" event={"ID":"438513cf-4480-46c2-b82e-ca515e475e06","Type":"ContainerStarted","Data":"3291dfa09b148830be83aa22f91b01338c45d54546180c30e914aece01fc0270"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.859434 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5gpqk" Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.861618 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" event={"ID":"a116c9d4-3422-4263-83ab-dd00009d9603","Type":"ContainerStarted","Data":"911efba8f1517be4984b7cf9f4119d4009e77c5c60adc334d88f539d32981834"} Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.882394 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-wgvph" podStartSLOduration=1.882373937 podStartE2EDuration="1.882373937s" podCreationTimestamp="2026-02-01 07:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:03:43.878035099 +0000 UTC m=+974.363937482" watchObservedRunningTime="2026-02-01 07:03:43.882373937 +0000 UTC m=+974.368276300" Feb 01 07:03:43 crc kubenswrapper[5127]: I0201 07:03:43.897658 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5gpqk" podStartSLOduration=1.89763532 podStartE2EDuration="1.89763532s" podCreationTimestamp="2026-02-01 07:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:03:43.893906359 +0000 UTC m=+974.379808742" watchObservedRunningTime="2026-02-01 07:03:43.89763532 +0000 UTC m=+974.383537683" Feb 01 07:03:49 crc kubenswrapper[5127]: I0201 07:03:49.910134 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" event={"ID":"a116c9d4-3422-4263-83ab-dd00009d9603","Type":"ContainerStarted","Data":"7249f7668cdf4193553b66c867aaed9c996238a4a614a21a48ced18049e0e90c"} Feb 01 07:03:49 crc kubenswrapper[5127]: I0201 07:03:49.910909 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:03:49 crc kubenswrapper[5127]: I0201 07:03:49.914161 5127 generic.go:334] "Generic (PLEG): container finished" podID="1db2b7d1-a38e-4b50-8b1e-a30a5d59608a" containerID="40c8dc9868695a659649c547e47820cb8e16c8ffd00d76c49b674e4c1310fbff" exitCode=0 Feb 01 07:03:49 crc kubenswrapper[5127]: I0201 07:03:49.914212 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerDied","Data":"40c8dc9868695a659649c547e47820cb8e16c8ffd00d76c49b674e4c1310fbff"} Feb 01 07:03:49 crc kubenswrapper[5127]: I0201 07:03:49.982765 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" podStartSLOduration=2.3235704139999998 podStartE2EDuration="8.982730378s" podCreationTimestamp="2026-02-01 07:03:41 +0000 UTC" firstStartedPulling="2026-02-01 07:03:42.851187083 +0000 UTC m=+973.337089446" lastFinishedPulling="2026-02-01 07:03:49.510347047 +0000 UTC m=+979.996249410" observedRunningTime="2026-02-01 07:03:49.941889433 +0000 UTC m=+980.427791876" watchObservedRunningTime="2026-02-01 07:03:49.982730378 +0000 UTC m=+980.468632791" Feb 01 07:03:50 crc kubenswrapper[5127]: I0201 07:03:50.922619 5127 generic.go:334] "Generic (PLEG): container finished" podID="1db2b7d1-a38e-4b50-8b1e-a30a5d59608a" containerID="630c065423ca930c0a622ea6b2b6f0bf28356cf879a1d76014658e1203db6f3e" exitCode=0 Feb 01 07:03:50 crc kubenswrapper[5127]: I0201 07:03:50.922660 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" 
event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerDied","Data":"630c065423ca930c0a622ea6b2b6f0bf28356cf879a1d76014658e1203db6f3e"} Feb 01 07:03:51 crc kubenswrapper[5127]: I0201 07:03:51.933437 5127 generic.go:334] "Generic (PLEG): container finished" podID="1db2b7d1-a38e-4b50-8b1e-a30a5d59608a" containerID="0bd72bbfbc05b6d4fa4c69f52aa61ab44ac74f9619fb45b6512af1efd5998ef7" exitCode=0 Feb 01 07:03:51 crc kubenswrapper[5127]: I0201 07:03:51.933526 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerDied","Data":"0bd72bbfbc05b6d4fa4c69f52aa61ab44ac74f9619fb45b6512af1efd5998ef7"} Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.409875 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-wgvph" Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.942137 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"e79ecaa25b059d139be356602221991c4fbf58116864eb696e37954f754d186e"} Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.942453 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"5720c20334e3b751096e5b98f57b90abde85d42d31a9ab86b29dd0a1ad059732"} Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.942462 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"5433222c7f4a77a2ea8fb38b359fbc1cae6625092190b8fe0a6e671849d44460"} Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.942470 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"2f5c0f499b96cee3aada3adab68f8325ca4a47553e3baca7a4b55ab7ccbf57f8"} Feb 01 07:03:52 crc kubenswrapper[5127]: I0201 07:03:52.942477 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"5513aea516ee855b4669b43ab4ecb6d252d42bf3304b522053b15ea896ba2cac"} Feb 01 07:03:53 crc kubenswrapper[5127]: I0201 07:03:53.960829 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vpnfg" event={"ID":"1db2b7d1-a38e-4b50-8b1e-a30a5d59608a","Type":"ContainerStarted","Data":"6010f8e9254b172f30e44ed243a8ca71cb2418a6a90b2f9a445a9d7c0d623319"} Feb 01 07:03:53 crc kubenswrapper[5127]: I0201 07:03:53.961748 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:53 crc kubenswrapper[5127]: I0201 07:03:53.995726 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vpnfg" podStartSLOduration=6.484033425 podStartE2EDuration="12.995706095s" podCreationTimestamp="2026-02-01 07:03:41 +0000 UTC" firstStartedPulling="2026-02-01 07:03:43.027780165 +0000 UTC m=+973.513682528" lastFinishedPulling="2026-02-01 07:03:49.539452805 +0000 UTC m=+980.025355198" observedRunningTime="2026-02-01 07:03:53.992182979 +0000 UTC m=+984.478085392" watchObservedRunningTime="2026-02-01 07:03:53.995706095 +0000 UTC m=+984.481608488" Feb 01 07:03:57 crc kubenswrapper[5127]: I0201 
07:03:57.927028 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:03:57 crc kubenswrapper[5127]: I0201 07:03:57.970561 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:04:02 crc kubenswrapper[5127]: I0201 07:04:02.613207 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-x4g6r" Feb 01 07:04:02 crc kubenswrapper[5127]: I0201 07:04:02.931572 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vpnfg" Feb 01 07:04:02 crc kubenswrapper[5127]: I0201 07:04:02.987464 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5gpqk" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.438480 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c"] Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.443389 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.445503 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.448122 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c"] Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.480433 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.480505 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtq9l\" (UniqueName: \"kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.480550 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.581301 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.581400 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.581437 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtq9l\" (UniqueName: \"kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.582027 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.582003 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.604062 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtq9l\" (UniqueName: \"kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:04 crc kubenswrapper[5127]: I0201 07:04:04.767337 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c"
Feb 01 07:04:05 crc kubenswrapper[5127]: I0201 07:04:05.056943 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c"]
Feb 01 07:04:06 crc kubenswrapper[5127]: I0201 07:04:06.041222 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerID="c716b365c2cc37dbcaa6397570a5f5867e75695f56fcc53a651ed5e88e6ccde9" exitCode=0
Feb 01 07:04:06 crc kubenswrapper[5127]: I0201 07:04:06.041310 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" event={"ID":"b6b15690-b95f-417f-956b-78ad11c53bb2","Type":"ContainerDied","Data":"c716b365c2cc37dbcaa6397570a5f5867e75695f56fcc53a651ed5e88e6ccde9"}
Feb 01 07:04:06 crc kubenswrapper[5127]: I0201 07:04:06.041523 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" event={"ID":"b6b15690-b95f-417f-956b-78ad11c53bb2","Type":"ContainerStarted","Data":"c615d482dca5f116f19b61bb9be7f22aa8af48d755cf59d6f2cf7ef93e14647a"}
Feb 01 07:04:06 crc kubenswrapper[5127]: I0201 07:04:06.741536 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:04:06 crc kubenswrapper[5127]: I0201 07:04:06.741630 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 07:04:10 crc kubenswrapper[5127]: I0201 07:04:10.070541 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerID="5bd9094d067d00fc89ed6d5f15fa5632af470aba2569d528f24753f04c6775ba" exitCode=0
Feb 01 07:04:10 crc kubenswrapper[5127]: I0201 07:04:10.070660 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" event={"ID":"b6b15690-b95f-417f-956b-78ad11c53bb2","Type":"ContainerDied","Data":"5bd9094d067d00fc89ed6d5f15fa5632af470aba2569d528f24753f04c6775ba"}
Feb 01 07:04:11 crc kubenswrapper[5127]: I0201 07:04:11.081990 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerID="ef74ca9c58ac27b22ec3ca15f07f09a60407b50c277580f293c24a753ed31752" exitCode=0
Feb 01 07:04:11 crc kubenswrapper[5127]: I0201 07:04:11.082039 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" event={"ID":"b6b15690-b95f-417f-956b-78ad11c53bb2","Type":"ContainerDied","Data":"ef74ca9c58ac27b22ec3ca15f07f09a60407b50c277580f293c24a753ed31752"}
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.441674 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c"
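The machine-config-daemon liveness failure above is the kubelet's HTTP prober reporting that nothing answered on 127.0.0.1:8798 at that moment. Functionally the probe is an HTTP GET with a short timeout that treats any 2xx/3xx response as healthy; a minimal stand-in (URL copied from the log; the 1-second timeout is the kubelet default, and the helper names are illustrative):

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

// probe performs one HTTP liveness check the way the kubelet does:
// 2xx/3xx means healthy; anything else, including dial errors, is a failure.
func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
		return fmt.Errorf("probe failed: %w", err)
	}
	defer resp.Body.Close()
	io.Copy(io.Discard, resp.Body) // drain so the connection can be reused
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println(err)
	}
}
```

A single failure like this one is not fatal: a container is restarted only after failureThreshold consecutive failures, and no restart of machine-config-daemon appears in the surrounding entries.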
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.526193 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtq9l\" (UniqueName: \"kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l\") pod \"b6b15690-b95f-417f-956b-78ad11c53bb2\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " 
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.526260 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle\") pod \"b6b15690-b95f-417f-956b-78ad11c53bb2\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " 
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.526306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util\") pod \"b6b15690-b95f-417f-956b-78ad11c53bb2\" (UID: \"b6b15690-b95f-417f-956b-78ad11c53bb2\") " 
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.528098 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle" (OuterVolumeSpecName: "bundle") pod "b6b15690-b95f-417f-956b-78ad11c53bb2" (UID: "b6b15690-b95f-417f-956b-78ad11c53bb2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.531109 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l" (OuterVolumeSpecName: "kube-api-access-rtq9l") pod "b6b15690-b95f-417f-956b-78ad11c53bb2" (UID: "b6b15690-b95f-417f-956b-78ad11c53bb2"). InnerVolumeSpecName "kube-api-access-rtq9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.552109 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util" (OuterVolumeSpecName: "util") pod "b6b15690-b95f-417f-956b-78ad11c53bb2" (UID: "b6b15690-b95f-417f-956b-78ad11c53bb2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.627845 5127 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-util\") on node \"crc\" DevicePath \"\"" Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.627874 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtq9l\" (UniqueName: \"kubernetes.io/projected/b6b15690-b95f-417f-956b-78ad11c53bb2-kube-api-access-rtq9l\") on node \"crc\" DevicePath \"\"" Feb 01 07:04:12 crc kubenswrapper[5127]: I0201 07:04:12.627884 5127 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b6b15690-b95f-417f-956b-78ad11c53bb2-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:04:13 crc kubenswrapper[5127]: I0201 07:04:13.097674 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" event={"ID":"b6b15690-b95f-417f-956b-78ad11c53bb2","Type":"ContainerDied","Data":"c615d482dca5f116f19b61bb9be7f22aa8af48d755cf59d6f2cf7ef93e14647a"} Feb 01 07:04:13 crc kubenswrapper[5127]: I0201 07:04:13.097724 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c615d482dca5f116f19b61bb9be7f22aa8af48d755cf59d6f2cf7ef93e14647a" Feb 01 07:04:13 crc kubenswrapper[5127]: I0201 07:04:13.097737 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.334032 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr"] Feb 01 07:04:17 crc kubenswrapper[5127]: E0201 07:04:17.335049 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="extract" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.335067 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="extract" Feb 01 07:04:17 crc kubenswrapper[5127]: E0201 07:04:17.335081 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="util" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.335087 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="util" Feb 01 07:04:17 crc kubenswrapper[5127]: E0201 07:04:17.335109 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="pull" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.335116 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="pull" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.335231 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b15690-b95f-417f-956b-78ad11c53bb2" containerName="extract" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.335789 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.338105 5127 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-x9pkc" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.338249 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.339445 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.351977 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr"] Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.490981 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46478dab-f50e-4529-a064-fe4fd39161dc-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.491145 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvmj\" (UniqueName: \"kubernetes.io/projected/46478dab-f50e-4529-a064-fe4fd39161dc-kube-api-access-gdvmj\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.592622 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46478dab-f50e-4529-a064-fe4fd39161dc-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.592724 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvmj\" (UniqueName: \"kubernetes.io/projected/46478dab-f50e-4529-a064-fe4fd39161dc-kube-api-access-gdvmj\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.593137 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46478dab-f50e-4529-a064-fe4fd39161dc-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.609309 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvmj\" (UniqueName: \"kubernetes.io/projected/46478dab-f50e-4529-a064-fe4fd39161dc-kube-api-access-gdvmj\") pod \"cert-manager-operator-controller-manager-66c8bdd694-grnnr\" (UID: \"46478dab-f50e-4529-a064-fe4fd39161dc\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:17 crc kubenswrapper[5127]: I0201 07:04:17.657060 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" Feb 01 07:04:18 crc kubenswrapper[5127]: I0201 07:04:18.112333 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr"] Feb 01 07:04:18 crc kubenswrapper[5127]: I0201 07:04:18.135963 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" event={"ID":"46478dab-f50e-4529-a064-fe4fd39161dc","Type":"ContainerStarted","Data":"a378b07083a7852346c58c4d9a597a098d54f5e948ac92616ec5622102001112"} Feb 01 07:04:21 crc kubenswrapper[5127]: I0201 07:04:21.153340 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" event={"ID":"46478dab-f50e-4529-a064-fe4fd39161dc","Type":"ContainerStarted","Data":"4f2bee6ca18b450d908e6780378e9df1e165e2117ef46461b87415ec7029b88a"} Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.911416 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-grnnr" podStartSLOduration=7.516306668 podStartE2EDuration="9.911380704s" podCreationTimestamp="2026-02-01 07:04:17 +0000 UTC" firstStartedPulling="2026-02-01 07:04:18.128865113 +0000 UTC m=+1008.614767466" lastFinishedPulling="2026-02-01 07:04:20.523939109 +0000 UTC m=+1011.009841502" observedRunningTime="2026-02-01 07:04:21.18498758 +0000 UTC m=+1011.670889933" watchObservedRunningTime="2026-02-01 07:04:26.911380704 +0000 UTC m=+1017.397283117" Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.918006 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-dhn4h"] Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.919700 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.922018 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.922405 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.922515 5127 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8kl7f" Feb 01 07:04:26 crc kubenswrapper[5127]: I0201 07:04:26.944043 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-dhn4h"] Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.027089 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.027343 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr6gl\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-kube-api-access-wr6gl\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.128486 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr6gl\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-kube-api-access-wr6gl\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.128687 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.154261 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.156516 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr6gl\" (UniqueName: \"kubernetes.io/projected/2eca325f-b927-4e7b-8500-875f68594b7e-kube-api-access-wr6gl\") pod \"cert-manager-cainjector-5545bd876-dhn4h\" (UID: \"2eca325f-b927-4e7b-8500-875f68594b7e\") " pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.266598 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" Feb 01 07:04:27 crc kubenswrapper[5127]: I0201 07:04:27.510731 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-dhn4h"] Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.052606 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7nfk8"] Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.053702 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.059667 5127 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sksf8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.070496 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7nfk8"] Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.142564 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfhwn\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-kube-api-access-mfhwn\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.142649 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.195440 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" event={"ID":"2eca325f-b927-4e7b-8500-875f68594b7e","Type":"ContainerStarted","Data":"3b9fb39da8cbc158fbcbe353b134d80bfe4d25fbf967f9762415996c6550011e"} Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.244338 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfhwn\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-kube-api-access-mfhwn\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.244420 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.269666 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.278435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-mfhwn\" (UniqueName: \"kubernetes.io/projected/42257850-8b63-4bc9-a884-ead44b084bf1-kube-api-access-mfhwn\") pod \"cert-manager-webhook-6888856db4-7nfk8\" (UID: \"42257850-8b63-4bc9-a884-ead44b084bf1\") " pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.375978 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:28 crc kubenswrapper[5127]: I0201 07:04:28.805982 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-7nfk8"] Feb 01 07:04:28 crc kubenswrapper[5127]: W0201 07:04:28.826363 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42257850_8b63_4bc9_a884_ead44b084bf1.slice/crio-6d712e2a81a92de7f834c3100e3c46614078dd4d7e96ecfd3158f228de7c7626 WatchSource:0}: Error finding container 6d712e2a81a92de7f834c3100e3c46614078dd4d7e96ecfd3158f228de7c7626: Status 404 returned error can't find the container with id 6d712e2a81a92de7f834c3100e3c46614078dd4d7e96ecfd3158f228de7c7626 Feb 01 07:04:29 crc kubenswrapper[5127]: I0201 07:04:29.210949 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" event={"ID":"42257850-8b63-4bc9-a884-ead44b084bf1","Type":"ContainerStarted","Data":"6d712e2a81a92de7f834c3100e3c46614078dd4d7e96ecfd3158f228de7c7626"} Feb 01 07:04:32 crc kubenswrapper[5127]: I0201 07:04:32.233093 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" event={"ID":"42257850-8b63-4bc9-a884-ead44b084bf1","Type":"ContainerStarted","Data":"ba67f863141c72c9df2486ef3f9203a110fcb4f8c3b08ff1760d6f48cbc2515c"} Feb 01 07:04:32 crc kubenswrapper[5127]: I0201 07:04:32.233630 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:32 crc kubenswrapper[5127]: I0201 07:04:32.242574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" event={"ID":"2eca325f-b927-4e7b-8500-875f68594b7e","Type":"ContainerStarted","Data":"7e4283b829ec65eefe8925ff2bf0ebdf0e13bd9f56e951f9cd1f6a85ed01df9e"} Feb 01 07:04:32 crc kubenswrapper[5127]: I0201 07:04:32.260040 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" podStartSLOduration=1.30681373 podStartE2EDuration="4.260018639s" podCreationTimestamp="2026-02-01 07:04:28 +0000 UTC" firstStartedPulling="2026-02-01 07:04:28.829402892 +0000 UTC m=+1019.315305255" lastFinishedPulling="2026-02-01 07:04:31.782607801 +0000 UTC m=+1022.268510164" observedRunningTime="2026-02-01 07:04:32.251501599 +0000 UTC m=+1022.737403962" watchObservedRunningTime="2026-02-01 07:04:32.260018639 +0000 UTC m=+1022.745920992" Feb 01 07:04:32 crc kubenswrapper[5127]: I0201 07:04:32.272318 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-dhn4h" podStartSLOduration=2.049035451 podStartE2EDuration="6.272284381s" podCreationTimestamp="2026-02-01 07:04:26 +0000 UTC" firstStartedPulling="2026-02-01 07:04:27.52571262 +0000 UTC m=+1018.011614993" lastFinishedPulling="2026-02-01 07:04:31.74896156 +0000 UTC m=+1022.234863923" observedRunningTime="2026-02-01 07:04:32.271012177 +0000 UTC 
m=+1022.756914540" watchObservedRunningTime="2026-02-01 07:04:32.272284381 +0000 UTC m=+1022.758186744" Feb 01 07:04:36 crc kubenswrapper[5127]: I0201 07:04:36.741036 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:04:36 crc kubenswrapper[5127]: I0201 07:04:36.741420 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:04:38 crc kubenswrapper[5127]: I0201 07:04:38.379454 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-7nfk8" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.813804 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-grq4k"] Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.815530 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.819099 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-grq4k"] Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.844351 5127 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-b4ds4" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.896151 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98hv\" (UniqueName: \"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-kube-api-access-x98hv\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.896245 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-bound-sa-token\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.997854 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98hv\" (UniqueName: \"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-kube-api-access-x98hv\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:43 crc kubenswrapper[5127]: I0201 07:04:43.997899 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-bound-sa-token\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:44 crc kubenswrapper[5127]: I0201 07:04:44.019358 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-bound-sa-token\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:44 crc kubenswrapper[5127]: I0201 07:04:44.023185 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98hv\" (UniqueName: \"kubernetes.io/projected/bab839cb-ff86-4576-8ff7-7de82e0f6757-kube-api-access-x98hv\") pod \"cert-manager-545d4d4674-grq4k\" (UID: \"bab839cb-ff86-4576-8ff7-7de82e0f6757\") " pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:44 crc kubenswrapper[5127]: I0201 07:04:44.164414 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-grq4k" Feb 01 07:04:44 crc kubenswrapper[5127]: I0201 07:04:44.415284 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-grq4k"] Feb 01 07:04:45 crc kubenswrapper[5127]: I0201 07:04:45.334030 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-grq4k" event={"ID":"bab839cb-ff86-4576-8ff7-7de82e0f6757","Type":"ContainerStarted","Data":"2eb0f451c74f9d8abef9dca8a18a43703958c3fdf0fd750985b09c4897585051"} Feb 01 07:04:45 crc kubenswrapper[5127]: I0201 07:04:45.334283 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-grq4k" event={"ID":"bab839cb-ff86-4576-8ff7-7de82e0f6757","Type":"ContainerStarted","Data":"25dda44cd0da05ca8f5f52b1920cf468536dd9fe5389e2ddf63816091aaec7e9"} Feb 01 07:04:45 crc kubenswrapper[5127]: I0201 07:04:45.348497 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-grq4k" podStartSLOduration=2.348476749 podStartE2EDuration="2.348476749s" podCreationTimestamp="2026-02-01 07:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:04:45.344971235 +0000 UTC m=+1035.830873588" watchObservedRunningTime="2026-02-01 07:04:45.348476749 +0000 UTC m=+1035.834379122" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.712690 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.714880 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.724103 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.724258 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.724826 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-smf5c" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.743396 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.811557 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5w4\" (UniqueName: \"kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4\") pod \"openstack-operator-index-6fbfx\" (UID: \"a782f400-7ee6-468a-8510-03d85e8bdd09\") " pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.912460 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5w4\" (UniqueName: \"kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4\") pod \"openstack-operator-index-6fbfx\" (UID: \"a782f400-7ee6-468a-8510-03d85e8bdd09\") " pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:51 crc kubenswrapper[5127]: I0201 07:04:51.939105 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx5w4\" (UniqueName: \"kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4\") pod \"openstack-operator-index-6fbfx\" (UID: \"a782f400-7ee6-468a-8510-03d85e8bdd09\") " pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:52 crc kubenswrapper[5127]: I0201 07:04:52.033530 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:52 crc kubenswrapper[5127]: I0201 07:04:52.493029 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:52 crc kubenswrapper[5127]: W0201 07:04:52.502674 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda782f400_7ee6_468a_8510_03d85e8bdd09.slice/crio-133b6ea106742186a3b0919aef95398c1acba3ffb804049ef2870c977c4f4990 WatchSource:0}: Error finding container 133b6ea106742186a3b0919aef95398c1acba3ffb804049ef2870c977c4f4990: Status 404 returned error can't find the container with id 133b6ea106742186a3b0919aef95398c1acba3ffb804049ef2870c977c4f4990 Feb 01 07:04:53 crc kubenswrapper[5127]: I0201 07:04:53.390357 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6fbfx" event={"ID":"a782f400-7ee6-468a-8510-03d85e8bdd09","Type":"ContainerStarted","Data":"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53"} Feb 01 07:04:53 crc kubenswrapper[5127]: I0201 07:04:53.390402 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6fbfx" event={"ID":"a782f400-7ee6-468a-8510-03d85e8bdd09","Type":"ContainerStarted","Data":"133b6ea106742186a3b0919aef95398c1acba3ffb804049ef2870c977c4f4990"} Feb 01 07:04:53 crc kubenswrapper[5127]: I0201 07:04:53.417068 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6fbfx" podStartSLOduration=1.711384239 podStartE2EDuration="2.417039744s" podCreationTimestamp="2026-02-01 07:04:51 +0000 UTC" firstStartedPulling="2026-02-01 07:04:52.506254068 +0000 UTC m=+1042.992156441" lastFinishedPulling="2026-02-01 07:04:53.211909583 +0000 UTC m=+1043.697811946" observedRunningTime="2026-02-01 07:04:53.408629027 +0000 UTC m=+1043.894531420" watchObservedRunningTime="2026-02-01 07:04:53.417039744 +0000 UTC m=+1043.902942137" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.080877 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.406915 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6fbfx" podUID="a782f400-7ee6-468a-8510-03d85e8bdd09" containerName="registry-server" containerID="cri-o://b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53" gracePeriod=2 Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.685197 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m9nqs"] Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.686253 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.696848 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m9nqs"] Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.766101 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xrfn\" (UniqueName: \"kubernetes.io/projected/7b4c130e-a6b2-4e51-a25f-044db714852e-kube-api-access-8xrfn\") pod \"openstack-operator-index-m9nqs\" (UID: \"7b4c130e-a6b2-4e51-a25f-044db714852e\") " pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.804439 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.867446 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xrfn\" (UniqueName: \"kubernetes.io/projected/7b4c130e-a6b2-4e51-a25f-044db714852e-kube-api-access-8xrfn\") pod \"openstack-operator-index-m9nqs\" (UID: \"7b4c130e-a6b2-4e51-a25f-044db714852e\") " pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.886193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xrfn\" (UniqueName: \"kubernetes.io/projected/7b4c130e-a6b2-4e51-a25f-044db714852e-kube-api-access-8xrfn\") pod \"openstack-operator-index-m9nqs\" (UID: \"7b4c130e-a6b2-4e51-a25f-044db714852e\") " pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.968935 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx5w4\" (UniqueName: \"kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4\") pod \"a782f400-7ee6-468a-8510-03d85e8bdd09\" (UID: \"a782f400-7ee6-468a-8510-03d85e8bdd09\") " Feb 01 07:04:55 crc kubenswrapper[5127]: I0201 07:04:55.971635 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4" (OuterVolumeSpecName: "kube-api-access-nx5w4") pod "a782f400-7ee6-468a-8510-03d85e8bdd09" (UID: "a782f400-7ee6-468a-8510-03d85e8bdd09"). InnerVolumeSpecName "kube-api-access-nx5w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.014857 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.071159 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx5w4\" (UniqueName: \"kubernetes.io/projected/a782f400-7ee6-468a-8510-03d85e8bdd09-kube-api-access-nx5w4\") on node \"crc\" DevicePath \"\"" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.319536 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m9nqs"] Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.415029 5127 generic.go:334] "Generic (PLEG): container finished" podID="a782f400-7ee6-468a-8510-03d85e8bdd09" containerID="b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53" exitCode=0 Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.415123 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6fbfx" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.415158 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6fbfx" event={"ID":"a782f400-7ee6-468a-8510-03d85e8bdd09","Type":"ContainerDied","Data":"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53"} Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.415222 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6fbfx" event={"ID":"a782f400-7ee6-468a-8510-03d85e8bdd09","Type":"ContainerDied","Data":"133b6ea106742186a3b0919aef95398c1acba3ffb804049ef2870c977c4f4990"} Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.415242 5127 scope.go:117] "RemoveContainer" containerID="b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.416997 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m9nqs" event={"ID":"7b4c130e-a6b2-4e51-a25f-044db714852e","Type":"ContainerStarted","Data":"d2a16173d1b2e08402680a291b7558c8e0a1f068671d38884020252dfc2b23f6"} Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.434644 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.439670 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6fbfx"] Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.441541 5127 scope.go:117] "RemoveContainer" containerID="b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53" Feb 01 07:04:56 crc kubenswrapper[5127]: E0201 07:04:56.441931 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53\": container with ID starting with b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53 not found: ID does not exist" containerID="b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53" Feb 01 07:04:56 crc kubenswrapper[5127]: I0201 07:04:56.441959 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53"} err="failed to get container status \"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53\": rpc error: code = NotFound desc = could not find container 
\"b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53\": container with ID starting with b0cc7a85502df55208cff12b46888a288836d5e605822ad519914c489b731c53 not found: ID does not exist" Feb 01 07:04:57 crc kubenswrapper[5127]: I0201 07:04:57.428007 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m9nqs" event={"ID":"7b4c130e-a6b2-4e51-a25f-044db714852e","Type":"ContainerStarted","Data":"0938fad82aee7881117f3e005a4bac114913484e7f1565cdb583527362cae4e3"} Feb 01 07:04:57 crc kubenswrapper[5127]: I0201 07:04:57.455353 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m9nqs" podStartSLOduration=1.725983525 podStartE2EDuration="2.455329512s" podCreationTimestamp="2026-02-01 07:04:55 +0000 UTC" firstStartedPulling="2026-02-01 07:04:56.33334874 +0000 UTC m=+1046.819251123" lastFinishedPulling="2026-02-01 07:04:57.062694717 +0000 UTC m=+1047.548597110" observedRunningTime="2026-02-01 07:04:57.45006918 +0000 UTC m=+1047.935971583" watchObservedRunningTime="2026-02-01 07:04:57.455329512 +0000 UTC m=+1047.941231915" Feb 01 07:04:58 crc kubenswrapper[5127]: I0201 07:04:58.248345 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a782f400-7ee6-468a-8510-03d85e8bdd09" path="/var/lib/kubelet/pods/a782f400-7ee6-468a-8510-03d85e8bdd09/volumes" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.015980 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.016543 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.052191 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.539729 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-m9nqs" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.741093 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.741154 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.741202 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.741752 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Feb 01 07:05:06 crc kubenswrapper[5127]: I0201 07:05:06.741818 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3" gracePeriod=600 Feb 01 07:05:07 crc kubenswrapper[5127]: I0201 07:05:07.506548 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3" exitCode=0 Feb 01 07:05:07 crc kubenswrapper[5127]: I0201 07:05:07.506632 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3"} Feb 01 07:05:07 crc kubenswrapper[5127]: I0201 07:05:07.506994 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5"} Feb 01 07:05:07 crc kubenswrapper[5127]: I0201 07:05:07.507022 5127 scope.go:117] "RemoveContainer" containerID="c0e3c3031601b19c3efb39d2ef7a904dac678201b3c13a21ac60151e7cab62c7" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.427604 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz"] Feb 01 07:05:14 crc kubenswrapper[5127]: E0201 07:05:14.429893 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a782f400-7ee6-468a-8510-03d85e8bdd09" containerName="registry-server" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.430069 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a782f400-7ee6-468a-8510-03d85e8bdd09" containerName="registry-server" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.430401 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a782f400-7ee6-468a-8510-03d85e8bdd09" containerName="registry-server" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.433233 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.437460 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-77b2z" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.440028 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz"] Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.534093 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.534306 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.534458 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdxh\" (UniqueName: \"kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.635745 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.635818 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.635866 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qdxh\" (UniqueName: \"kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.636477 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.636831 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.659565 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdxh\" (UniqueName: \"kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh\") pod \"b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:14 crc kubenswrapper[5127]: I0201 07:05:14.754847 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:15 crc kubenswrapper[5127]: I0201 07:05:15.238622 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz"] Feb 01 07:05:15 crc kubenswrapper[5127]: I0201 07:05:15.580221 5127 generic.go:334] "Generic (PLEG): container finished" podID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerID="d48eb3d28125fd4f26285487d10244ccf97895817e24a1a4ad9399d11842bc3d" exitCode=0 Feb 01 07:05:15 crc kubenswrapper[5127]: I0201 07:05:15.580302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" event={"ID":"608c77ac-5d08-4c81-803d-b9aed3ba9d73","Type":"ContainerDied","Data":"d48eb3d28125fd4f26285487d10244ccf97895817e24a1a4ad9399d11842bc3d"} Feb 01 07:05:15 crc kubenswrapper[5127]: I0201 07:05:15.580434 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" event={"ID":"608c77ac-5d08-4c81-803d-b9aed3ba9d73","Type":"ContainerStarted","Data":"3608498442b8cd208857c682ab04dd14f84bbf2f749a66ef6473875bbc45ba23"} Feb 01 07:05:16 crc kubenswrapper[5127]: I0201 07:05:16.591559 5127 generic.go:334] "Generic (PLEG): container finished" podID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerID="2bb6ee103e5c9a10568e853201691d185b93b03a8dab578a07f37ba3d6bae858" exitCode=0 Feb 01 07:05:16 crc kubenswrapper[5127]: I0201 07:05:16.591858 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" event={"ID":"608c77ac-5d08-4c81-803d-b9aed3ba9d73","Type":"ContainerDied","Data":"2bb6ee103e5c9a10568e853201691d185b93b03a8dab578a07f37ba3d6bae858"} Feb 01 07:05:17 crc kubenswrapper[5127]: I0201 07:05:17.603282 5127 generic.go:334] "Generic (PLEG): container finished" podID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerID="501ef2fe928d21253930f9a7821473fa309299fe8cbc489876f9e48abc604598" exitCode=0 Feb 01 07:05:17 crc kubenswrapper[5127]: I0201 07:05:17.603365 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" event={"ID":"608c77ac-5d08-4c81-803d-b9aed3ba9d73","Type":"ContainerDied","Data":"501ef2fe928d21253930f9a7821473fa309299fe8cbc489876f9e48abc604598"} Feb 01 07:05:18 crc kubenswrapper[5127]: I0201 07:05:18.999059 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.103173 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle\") pod \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.103272 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util\") pod \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.103298 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qdxh\" (UniqueName: \"kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh\") pod \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\" (UID: \"608c77ac-5d08-4c81-803d-b9aed3ba9d73\") " Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.103889 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle" (OuterVolumeSpecName: "bundle") pod "608c77ac-5d08-4c81-803d-b9aed3ba9d73" (UID: "608c77ac-5d08-4c81-803d-b9aed3ba9d73"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.121743 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh" (OuterVolumeSpecName: "kube-api-access-9qdxh") pod "608c77ac-5d08-4c81-803d-b9aed3ba9d73" (UID: "608c77ac-5d08-4c81-803d-b9aed3ba9d73"). InnerVolumeSpecName "kube-api-access-9qdxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.122158 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util" (OuterVolumeSpecName: "util") pod "608c77ac-5d08-4c81-803d-b9aed3ba9d73" (UID: "608c77ac-5d08-4c81-803d-b9aed3ba9d73"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.204475 5127 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.204866 5127 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/608c77ac-5d08-4c81-803d-b9aed3ba9d73-util\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.204978 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qdxh\" (UniqueName: \"kubernetes.io/projected/608c77ac-5d08-4c81-803d-b9aed3ba9d73-kube-api-access-9qdxh\") on node \"crc\" DevicePath \"\"" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.622676 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" event={"ID":"608c77ac-5d08-4c81-803d-b9aed3ba9d73","Type":"ContainerDied","Data":"3608498442b8cd208857c682ab04dd14f84bbf2f749a66ef6473875bbc45ba23"} Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.622735 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3608498442b8cd208857c682ab04dd14f84bbf2f749a66ef6473875bbc45ba23" Feb 01 07:05:19 crc kubenswrapper[5127]: I0201 07:05:19.622749 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.893471 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9"] Feb 01 07:05:26 crc kubenswrapper[5127]: E0201 07:05:26.894156 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="pull" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.894167 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="pull" Feb 01 07:05:26 crc kubenswrapper[5127]: E0201 07:05:26.894180 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="util" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.894186 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="util" Feb 01 07:05:26 crc kubenswrapper[5127]: E0201 07:05:26.894198 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="extract" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.894204 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="extract" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.894298 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="608c77ac-5d08-4c81-803d-b9aed3ba9d73" containerName="extract" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.894690 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:26 crc kubenswrapper[5127]: W0201 07:05:26.897004 5127 reflector.go:561] object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rz695": failed to list *v1.Secret: secrets "openstack-operator-controller-init-dockercfg-rz695" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Feb 01 07:05:26 crc kubenswrapper[5127]: E0201 07:05:26.897130 5127 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-controller-init-dockercfg-rz695\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-controller-init-dockercfg-rz695\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 01 07:05:26 crc kubenswrapper[5127]: I0201 07:05:26.918239 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9"] Feb 01 07:05:27 crc kubenswrapper[5127]: I0201 07:05:27.013984 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ght7m\" (UniqueName: \"kubernetes.io/projected/3d5d76ba-0b08-4c6e-b89d-fa438a766a13-kube-api-access-ght7m\") pod \"openstack-operator-controller-init-757f46c65d-d2mn9\" (UID: \"3d5d76ba-0b08-4c6e-b89d-fa438a766a13\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:27 crc kubenswrapper[5127]: I0201 07:05:27.115414 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ght7m\" (UniqueName: \"kubernetes.io/projected/3d5d76ba-0b08-4c6e-b89d-fa438a766a13-kube-api-access-ght7m\") pod \"openstack-operator-controller-init-757f46c65d-d2mn9\" (UID: \"3d5d76ba-0b08-4c6e-b89d-fa438a766a13\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:27 crc kubenswrapper[5127]: I0201 07:05:27.137246 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ght7m\" (UniqueName: \"kubernetes.io/projected/3d5d76ba-0b08-4c6e-b89d-fa438a766a13-kube-api-access-ght7m\") pod \"openstack-operator-controller-init-757f46c65d-d2mn9\" (UID: \"3d5d76ba-0b08-4c6e-b89d-fa438a766a13\") " pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:28 crc kubenswrapper[5127]: I0201 07:05:28.213832 5127 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" secret="" err="failed to sync secret cache: timed out waiting for the condition" Feb 01 07:05:28 crc kubenswrapper[5127]: I0201 07:05:28.214111 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:28 crc kubenswrapper[5127]: I0201 07:05:28.425597 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rz695" Feb 01 07:05:28 crc kubenswrapper[5127]: I0201 07:05:28.496335 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9"] Feb 01 07:05:28 crc kubenswrapper[5127]: I0201 07:05:28.687379 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" event={"ID":"3d5d76ba-0b08-4c6e-b89d-fa438a766a13","Type":"ContainerStarted","Data":"875400a2e9563415bb0eaf52784508226c425ffa8d508c3605c50411b01d52e8"} Feb 01 07:05:32 crc kubenswrapper[5127]: I0201 07:05:32.719415 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" event={"ID":"3d5d76ba-0b08-4c6e-b89d-fa438a766a13","Type":"ContainerStarted","Data":"76f5c635408fc78e707320afebb6d719e62005f7bbdae9396dd461c5edcd2006"} Feb 01 07:05:32 crc kubenswrapper[5127]: I0201 07:05:32.720123 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:05:32 crc kubenswrapper[5127]: I0201 07:05:32.761414 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" podStartSLOduration=2.862734289 podStartE2EDuration="6.761399548s" podCreationTimestamp="2026-02-01 07:05:26 +0000 UTC" firstStartedPulling="2026-02-01 07:05:28.521497675 +0000 UTC m=+1079.007400038" lastFinishedPulling="2026-02-01 07:05:32.420162914 +0000 UTC m=+1082.906065297" observedRunningTime="2026-02-01 07:05:32.757681867 +0000 UTC m=+1083.243584230" watchObservedRunningTime="2026-02-01 07:05:32.761399548 +0000 UTC m=+1083.247301911" Feb 01 07:05:38 crc kubenswrapper[5127]: I0201 07:05:38.216501 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-757f46c65d-d2mn9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.131862 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.135622 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.138028 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.138247 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jfzmw" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.138809 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.140030 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-6vcj6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.145399 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.157157 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.158026 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.161222 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gfb76" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.170602 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.179987 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pps28\" (UniqueName: \"kubernetes.io/projected/41b36a3a-ccdb-4db2-b23b-110fdd81e06b-kube-api-access-pps28\") pod \"cinder-operator-controller-manager-8d874c8fc-msn79\" (UID: \"41b36a3a-ccdb-4db2-b23b-110fdd81e06b\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.180041 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gs4v\" (UniqueName: \"kubernetes.io/projected/e4b9555b-b0a0-48c0-a488-2fa76ba13e19-kube-api-access-8gs4v\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-5qgzf\" (UID: \"e4b9555b-b0a0-48c0-a488-2fa76ba13e19\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.180075 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9vx4\" (UniqueName: \"kubernetes.io/projected/155e3129-7d47-4eef-ae17-445a4847e3c4-kube-api-access-l9vx4\") pod \"designate-operator-controller-manager-6d9697b7f4-c5r2v\" (UID: \"155e3129-7d47-4eef-ae17-445a4847e3c4\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.199285 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.200159 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.202734 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-82cgb" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.206651 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.207564 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.213392 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-pfkz8" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.215395 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.292517 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9vx4\" (UniqueName: \"kubernetes.io/projected/155e3129-7d47-4eef-ae17-445a4847e3c4-kube-api-access-l9vx4\") pod \"designate-operator-controller-manager-6d9697b7f4-c5r2v\" (UID: \"155e3129-7d47-4eef-ae17-445a4847e3c4\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.292727 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pps28\" (UniqueName: \"kubernetes.io/projected/41b36a3a-ccdb-4db2-b23b-110fdd81e06b-kube-api-access-pps28\") pod \"cinder-operator-controller-manager-8d874c8fc-msn79\" (UID: \"41b36a3a-ccdb-4db2-b23b-110fdd81e06b\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.292783 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gs4v\" (UniqueName: \"kubernetes.io/projected/e4b9555b-b0a0-48c0-a488-2fa76ba13e19-kube-api-access-8gs4v\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-5qgzf\" (UID: \"e4b9555b-b0a0-48c0-a488-2fa76ba13e19\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.381563 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pps28\" (UniqueName: \"kubernetes.io/projected/41b36a3a-ccdb-4db2-b23b-110fdd81e06b-kube-api-access-pps28\") pod \"cinder-operator-controller-manager-8d874c8fc-msn79\" (UID: \"41b36a3a-ccdb-4db2-b23b-110fdd81e06b\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.381573 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.387383 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gs4v\" (UniqueName: \"kubernetes.io/projected/e4b9555b-b0a0-48c0-a488-2fa76ba13e19-kube-api-access-8gs4v\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-5qgzf\" (UID: \"e4b9555b-b0a0-48c0-a488-2fa76ba13e19\") " 
pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.388435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9vx4\" (UniqueName: \"kubernetes.io/projected/155e3129-7d47-4eef-ae17-445a4847e3c4-kube-api-access-l9vx4\") pod \"designate-operator-controller-manager-6d9697b7f4-c5r2v\" (UID: \"155e3129-7d47-4eef-ae17-445a4847e3c4\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.394258 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.400377 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k69jx\" (UniqueName: \"kubernetes.io/projected/dfba9a52-4b08-4001-bec8-0faf57fb61a0-kube-api-access-k69jx\") pod \"glance-operator-controller-manager-8886f4c47-sjjbc\" (UID: \"dfba9a52-4b08-4001-bec8-0faf57fb61a0\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.400419 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh7tx\" (UniqueName: \"kubernetes.io/projected/020efd87-3f4e-4762-9853-4f08d7f744cd-kube-api-access-mh7tx\") pod \"heat-operator-controller-manager-69d6db494d-k59z5\" (UID: \"020efd87-3f4e-4762-9853-4f08d7f744cd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.407512 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.408536 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.419480 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-g78qw" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.430855 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.445100 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.446140 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.450365 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-2nrng" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.460195 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.473756 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.474175 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.498650 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.499464 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.500276 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.501134 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hscv\" (UniqueName: \"kubernetes.io/projected/b14e4493-339f-480c-84eb-7be38d967aef-kube-api-access-6hscv\") pod \"ironic-operator-controller-manager-5f4b8bd54d-kssvc\" (UID: \"b14e4493-339f-480c-84eb-7be38d967aef\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.501163 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k69jx\" (UniqueName: \"kubernetes.io/projected/dfba9a52-4b08-4001-bec8-0faf57fb61a0-kube-api-access-k69jx\") pod \"glance-operator-controller-manager-8886f4c47-sjjbc\" (UID: \"dfba9a52-4b08-4001-bec8-0faf57fb61a0\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.501188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh7tx\" (UniqueName: \"kubernetes.io/projected/020efd87-3f4e-4762-9853-4f08d7f744cd-kube-api-access-mh7tx\") pod \"heat-operator-controller-manager-69d6db494d-k59z5\" (UID: \"020efd87-3f4e-4762-9853-4f08d7f744cd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.501224 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jp8\" (UniqueName: \"kubernetes.io/projected/d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d-kube-api-access-96jp8\") pod \"horizon-operator-controller-manager-5fb775575f-lj688\" (UID: \"d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.505244 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wrsdq" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.513630 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.514515 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.515875 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nszkn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.522662 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.524402 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.532048 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.532881 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.535236 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lwnz9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.537446 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh7tx\" (UniqueName: \"kubernetes.io/projected/020efd87-3f4e-4762-9853-4f08d7f744cd-kube-api-access-mh7tx\") pod \"heat-operator-controller-manager-69d6db494d-k59z5\" (UID: \"020efd87-3f4e-4762-9853-4f08d7f744cd\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.537499 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k69jx\" (UniqueName: \"kubernetes.io/projected/dfba9a52-4b08-4001-bec8-0faf57fb61a0-kube-api-access-k69jx\") pod \"glance-operator-controller-manager-8886f4c47-sjjbc\" (UID: \"dfba9a52-4b08-4001-bec8-0faf57fb61a0\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.540703 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.548871 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.561799 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.565807 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.569714 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jvn88" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.570558 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.578848 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.580046 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.581566 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nqr77" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.583468 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.584364 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.589750 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s5gbn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.592126 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.600321 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602157 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hscv\" (UniqueName: \"kubernetes.io/projected/b14e4493-339f-480c-84eb-7be38d967aef-kube-api-access-6hscv\") pod \"ironic-operator-controller-manager-5f4b8bd54d-kssvc\" (UID: \"b14e4493-339f-480c-84eb-7be38d967aef\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602243 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9h24\" (UniqueName: \"kubernetes.io/projected/a050d4cf-e8ae-4983-aeef-5504bd4ffdc3-kube-api-access-p9h24\") pod \"manila-operator-controller-manager-7dd968899f-gv8nn\" (UID: \"a050d4cf-e8ae-4983-aeef-5504bd4ffdc3\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602277 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9tx\" (UniqueName: \"kubernetes.io/projected/60763bd0-4a99-48da-b53f-1dfddcfd2dda-kube-api-access-4g9tx\") pod \"mariadb-operator-controller-manager-67bf948998-ll87b\" (UID: \"60763bd0-4a99-48da-b53f-1dfddcfd2dda\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 
07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602305 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602341 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96jp8\" (UniqueName: \"kubernetes.io/projected/d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d-kube-api-access-96jp8\") pod \"horizon-operator-controller-manager-5fb775575f-lj688\" (UID: \"d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602391 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqsn\" (UniqueName: \"kubernetes.io/projected/181c451d-b9c2-4b75-b271-a3d33fc7c200-kube-api-access-thqsn\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602442 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf4jt\" (UniqueName: \"kubernetes.io/projected/6c57a02e-b635-4a20-921e-fc1ce29bd6e1-kube-api-access-tf4jt\") pod \"keystone-operator-controller-manager-84f48565d4-v8fwx\" (UID: \"6c57a02e-b635-4a20-921e-fc1ce29bd6e1\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.602763 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.605271 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ggvqd" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.606071 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.610516 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.620019 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.633504 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.633826 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hscv\" (UniqueName: \"kubernetes.io/projected/b14e4493-339f-480c-84eb-7be38d967aef-kube-api-access-6hscv\") pod \"ironic-operator-controller-manager-5f4b8bd54d-kssvc\" (UID: \"b14e4493-339f-480c-84eb-7be38d967aef\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.635016 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.639670 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qs6z4" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.639864 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.639944 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jp8\" (UniqueName: \"kubernetes.io/projected/d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d-kube-api-access-96jp8\") pod \"horizon-operator-controller-manager-5fb775575f-lj688\" (UID: \"d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.643201 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.644175 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.653300 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dmp76" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.656813 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.663474 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.685442 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.686791 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.688468 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-5khcn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.692191 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-646f6"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.692878 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.696499 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kdbf4" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.705232 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-646f6"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.706808 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmkx\" (UniqueName: \"kubernetes.io/projected/f972515b-c0d8-497e-87e8-ec5a8f3e4151-kube-api-access-6jmkx\") pod \"neutron-operator-controller-manager-585dbc889-2lrrd\" (UID: \"f972515b-c0d8-497e-87e8-ec5a8f3e4151\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.706871 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqsn\" (UniqueName: \"kubernetes.io/projected/181c451d-b9c2-4b75-b271-a3d33fc7c200-kube-api-access-thqsn\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.706909 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 
crc kubenswrapper[5127]: I0201 07:06:14.706932 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mzb9\" (UniqueName: \"kubernetes.io/projected/22ada428-9d6c-41dc-8c3f-a6684d72f4b3-kube-api-access-4mzb9\") pod \"ovn-operator-controller-manager-788c46999f-6z59p\" (UID: \"22ada428-9d6c-41dc-8c3f-a6684d72f4b3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.706958 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf4jt\" (UniqueName: \"kubernetes.io/projected/6c57a02e-b635-4a20-921e-fc1ce29bd6e1-kube-api-access-tf4jt\") pod \"keystone-operator-controller-manager-84f48565d4-v8fwx\" (UID: \"6c57a02e-b635-4a20-921e-fc1ce29bd6e1\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.706976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4jgh\" (UniqueName: \"kubernetes.io/projected/cb22bae8-ed04-41f1-8061-149713da4d9f-kube-api-access-w4jgh\") pod \"octavia-operator-controller-manager-6687f8d877-kwkzq\" (UID: \"cb22bae8-ed04-41f1-8061-149713da4d9f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.707010 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjd7t\" (UniqueName: \"kubernetes.io/projected/be2cd21c-5775-450d-9933-9914e99730a6-kube-api-access-pjd7t\") pod \"nova-operator-controller-manager-55bff696bd-wgpr9\" (UID: \"be2cd21c-5775-450d-9933-9914e99730a6\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.707055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9h24\" (UniqueName: \"kubernetes.io/projected/a050d4cf-e8ae-4983-aeef-5504bd4ffdc3-kube-api-access-p9h24\") pod \"manila-operator-controller-manager-7dd968899f-gv8nn\" (UID: \"a050d4cf-e8ae-4983-aeef-5504bd4ffdc3\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.707072 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9tx\" (UniqueName: \"kubernetes.io/projected/60763bd0-4a99-48da-b53f-1dfddcfd2dda-kube-api-access-4g9tx\") pod \"mariadb-operator-controller-manager-67bf948998-ll87b\" (UID: \"60763bd0-4a99-48da-b53f-1dfddcfd2dda\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.707094 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.707114 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cnb2\" (UniqueName: \"kubernetes.io/projected/a6886643-fe68-466d-ab2f-0dfd752dbe0f-kube-api-access-4cnb2\") pod 
\"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 crc kubenswrapper[5127]: E0201 07:06:14.707631 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:14 crc kubenswrapper[5127]: E0201 07:06:14.707687 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:15.20766754 +0000 UTC m=+1125.693569903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.711514 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.754993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqsn\" (UniqueName: \"kubernetes.io/projected/181c451d-b9c2-4b75-b271-a3d33fc7c200-kube-api-access-thqsn\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.765634 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9tx\" (UniqueName: \"kubernetes.io/projected/60763bd0-4a99-48da-b53f-1dfddcfd2dda-kube-api-access-4g9tx\") pod \"mariadb-operator-controller-manager-67bf948998-ll87b\" (UID: \"60763bd0-4a99-48da-b53f-1dfddcfd2dda\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.769175 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9h24\" (UniqueName: \"kubernetes.io/projected/a050d4cf-e8ae-4983-aeef-5504bd4ffdc3-kube-api-access-p9h24\") pod \"manila-operator-controller-manager-7dd968899f-gv8nn\" (UID: \"a050d4cf-e8ae-4983-aeef-5504bd4ffdc3\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.776340 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.776721 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf4jt\" (UniqueName: \"kubernetes.io/projected/6c57a02e-b635-4a20-921e-fc1ce29bd6e1-kube-api-access-tf4jt\") pod \"keystone-operator-controller-manager-84f48565d4-v8fwx\" (UID: \"6c57a02e-b635-4a20-921e-fc1ce29bd6e1\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.802775 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.805271 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.807452 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809702 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmkx\" (UniqueName: \"kubernetes.io/projected/f972515b-c0d8-497e-87e8-ec5a8f3e4151-kube-api-access-6jmkx\") pod \"neutron-operator-controller-manager-585dbc889-2lrrd\" (UID: \"f972515b-c0d8-497e-87e8-ec5a8f3e4151\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809765 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809792 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mzb9\" (UniqueName: \"kubernetes.io/projected/22ada428-9d6c-41dc-8c3f-a6684d72f4b3-kube-api-access-4mzb9\") pod \"ovn-operator-controller-manager-788c46999f-6z59p\" (UID: \"22ada428-9d6c-41dc-8c3f-a6684d72f4b3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809821 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4jgh\" (UniqueName: \"kubernetes.io/projected/cb22bae8-ed04-41f1-8061-149713da4d9f-kube-api-access-w4jgh\") pod \"octavia-operator-controller-manager-6687f8d877-kwkzq\" (UID: \"cb22bae8-ed04-41f1-8061-149713da4d9f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809855 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjd7t\" (UniqueName: \"kubernetes.io/projected/be2cd21c-5775-450d-9933-9914e99730a6-kube-api-access-pjd7t\") pod \"nova-operator-controller-manager-55bff696bd-wgpr9\" (UID: \"be2cd21c-5775-450d-9933-9914e99730a6\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809943 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8tzh\" (UniqueName: \"kubernetes.io/projected/464ffa34-bb5b-4e78-9fa1-d106fd67de1d-kube-api-access-f8tzh\") pod \"swift-operator-controller-manager-68fc8c869-646f6\" (UID: \"464ffa34-bb5b-4e78-9fa1-d106fd67de1d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.809968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8chs\" (UniqueName: 
\"kubernetes.io/projected/71b843c8-50d1-4b1b-83ca-33d72bb16b5e-kube-api-access-z8chs\") pod \"placement-operator-controller-manager-5b964cf4cd-5wvqj\" (UID: \"71b843c8-50d1-4b1b-83ca-33d72bb16b5e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.810040 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cnb2\" (UniqueName: \"kubernetes.io/projected/a6886643-fe68-466d-ab2f-0dfd752dbe0f-kube-api-access-4cnb2\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 crc kubenswrapper[5127]: E0201 07:06:14.810643 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:14 crc kubenswrapper[5127]: E0201 07:06:14.810688 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:15.310675817 +0000 UTC m=+1125.796578180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.816426 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rr8pn" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.879252 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.879738 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.883877 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mzb9\" (UniqueName: \"kubernetes.io/projected/22ada428-9d6c-41dc-8c3f-a6684d72f4b3-kube-api-access-4mzb9\") pod \"ovn-operator-controller-manager-788c46999f-6z59p\" (UID: \"22ada428-9d6c-41dc-8c3f-a6684d72f4b3\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.890064 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cnb2\" (UniqueName: \"kubernetes.io/projected/a6886643-fe68-466d-ab2f-0dfd752dbe0f-kube-api-access-4cnb2\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.890412 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmkx\" (UniqueName: \"kubernetes.io/projected/f972515b-c0d8-497e-87e8-ec5a8f3e4151-kube-api-access-6jmkx\") pod \"neutron-operator-controller-manager-585dbc889-2lrrd\" (UID: \"f972515b-c0d8-497e-87e8-ec5a8f3e4151\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.891296 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjd7t\" (UniqueName: \"kubernetes.io/projected/be2cd21c-5775-450d-9933-9914e99730a6-kube-api-access-pjd7t\") pod \"nova-operator-controller-manager-55bff696bd-wgpr9\" (UID: \"be2cd21c-5775-450d-9933-9914e99730a6\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.893395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4jgh\" (UniqueName: \"kubernetes.io/projected/cb22bae8-ed04-41f1-8061-149713da4d9f-kube-api-access-w4jgh\") pod \"octavia-operator-controller-manager-6687f8d877-kwkzq\" (UID: \"cb22bae8-ed04-41f1-8061-149713da4d9f\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.906257 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.911028 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8tzh\" (UniqueName: \"kubernetes.io/projected/464ffa34-bb5b-4e78-9fa1-d106fd67de1d-kube-api-access-f8tzh\") pod \"swift-operator-controller-manager-68fc8c869-646f6\" (UID: \"464ffa34-bb5b-4e78-9fa1-d106fd67de1d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.911062 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8chs\" (UniqueName: \"kubernetes.io/projected/71b843c8-50d1-4b1b-83ca-33d72bb16b5e-kube-api-access-z8chs\") pod \"placement-operator-controller-manager-5b964cf4cd-5wvqj\" (UID: \"71b843c8-50d1-4b1b-83ca-33d72bb16b5e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.911200 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvpq\" (UniqueName: \"kubernetes.io/projected/43e73360-cfda-420c-8df1-fe2e50b31d0c-kube-api-access-2hvpq\") pod \"telemetry-operator-controller-manager-64b5b76f97-9r2rh\" (UID: \"43e73360-cfda-420c-8df1-fe2e50b31d0c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.932915 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8chs\" (UniqueName: \"kubernetes.io/projected/71b843c8-50d1-4b1b-83ca-33d72bb16b5e-kube-api-access-z8chs\") pod \"placement-operator-controller-manager-5b964cf4cd-5wvqj\" (UID: \"71b843c8-50d1-4b1b-83ca-33d72bb16b5e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.933254 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.934226 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8tzh\" (UniqueName: \"kubernetes.io/projected/464ffa34-bb5b-4e78-9fa1-d106fd67de1d-kube-api-access-f8tzh\") pod \"swift-operator-controller-manager-68fc8c869-646f6\" (UID: \"464ffa34-bb5b-4e78-9fa1-d106fd67de1d\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.935777 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.940989 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.942046 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.946296 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xhc5x" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.962008 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv"] Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.977745 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" Feb 01 07:06:14 crc kubenswrapper[5127]: I0201 07:06:14.983315 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.009775 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.014324 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zgg\" (UniqueName: \"kubernetes.io/projected/c25dac7e-0533-4ac5-9fc8-cabbf5e340bc-kube-api-access-r9zgg\") pod \"test-operator-controller-manager-56f8bfcd9f-c89bv\" (UID: \"c25dac7e-0533-4ac5-9fc8-cabbf5e340bc\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.014396 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvpq\" (UniqueName: \"kubernetes.io/projected/43e73360-cfda-420c-8df1-fe2e50b31d0c-kube-api-access-2hvpq\") pod \"telemetry-operator-controller-manager-64b5b76f97-9r2rh\" (UID: \"43e73360-cfda-420c-8df1-fe2e50b31d0c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.042289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvpq\" (UniqueName: \"kubernetes.io/projected/43e73360-cfda-420c-8df1-fe2e50b31d0c-kube-api-access-2hvpq\") pod \"telemetry-operator-controller-manager-64b5b76f97-9r2rh\" (UID: \"43e73360-cfda-420c-8df1-fe2e50b31d0c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.042354 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-97rgn"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.043277 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.044804 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.045192 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.050331 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vkxjr" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.061290 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-97rgn"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.077042 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.078019 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.080246 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.082832 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.083014 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.083141 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jb97g" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.091694 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.096502 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.105054 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.105447 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.109229 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bgv9b" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.115496 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.115651 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzwj\" (UniqueName: \"kubernetes.io/projected/ec968356-989e-4e17-b755-66c8a2b8109a-kube-api-access-zlzwj\") pod \"watcher-operator-controller-manager-564965969-97rgn\" (UID: \"ec968356-989e-4e17-b755-66c8a2b8109a\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.115682 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkmx5\" (UniqueName: \"kubernetes.io/projected/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-kube-api-access-pkmx5\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.115734 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zgg\" (UniqueName: \"kubernetes.io/projected/c25dac7e-0533-4ac5-9fc8-cabbf5e340bc-kube-api-access-r9zgg\") pod \"test-operator-controller-manager-56f8bfcd9f-c89bv\" (UID: \"c25dac7e-0533-4ac5-9fc8-cabbf5e340bc\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.117179 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.139464 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zgg\" (UniqueName: \"kubernetes.io/projected/c25dac7e-0533-4ac5-9fc8-cabbf5e340bc-kube-api-access-r9zgg\") pod \"test-operator-controller-manager-56f8bfcd9f-c89bv\" (UID: \"c25dac7e-0533-4ac5-9fc8-cabbf5e340bc\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.144673 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218331 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218388 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218424 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzwj\" (UniqueName: \"kubernetes.io/projected/ec968356-989e-4e17-b755-66c8a2b8109a-kube-api-access-zlzwj\") pod \"watcher-operator-controller-manager-564965969-97rgn\" (UID: \"ec968356-989e-4e17-b755-66c8a2b8109a\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218474 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkmx5\" (UniqueName: \"kubernetes.io/projected/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-kube-api-access-pkmx5\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.218527 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb5gp\" (UniqueName: \"kubernetes.io/projected/144ee88b-a5c7-46da-9e39-8f3c71d9499d-kube-api-access-vb5gp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbcxg\" (UID: \"144ee88b-a5c7-46da-9e39-8f3c71d9499d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.218709 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.218750 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:15.71873556 +0000 UTC m=+1126.204637923 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.218951 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.218984 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:15.718976156 +0000 UTC m=+1126.204878519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.219018 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.219036 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:16.219030998 +0000 UTC m=+1126.704933361 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.244046 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzwj\" (UniqueName: \"kubernetes.io/projected/ec968356-989e-4e17-b755-66c8a2b8109a-kube-api-access-zlzwj\") pod \"watcher-operator-controller-manager-564965969-97rgn\" (UID: \"ec968356-989e-4e17-b755-66c8a2b8109a\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.250055 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkmx5\" (UniqueName: \"kubernetes.io/projected/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-kube-api-access-pkmx5\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.286595 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.312063 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.322990 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.323030 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb5gp\" (UniqueName: \"kubernetes.io/projected/144ee88b-a5c7-46da-9e39-8f3c71d9499d-kube-api-access-vb5gp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbcxg\" (UID: \"144ee88b-a5c7-46da-9e39-8f3c71d9499d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.324044 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.324107 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:16.32409138 +0000 UTC m=+1126.809993743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.333477 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.348000 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.355858 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb5gp\" (UniqueName: \"kubernetes.io/projected/144ee88b-a5c7-46da-9e39-8f3c71d9499d-kube-api-access-vb5gp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-bbcxg\" (UID: \"144ee88b-a5c7-46da-9e39-8f3c71d9499d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.364720 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5"] Feb 01 07:06:15 crc kubenswrapper[5127]: W0201 07:06:15.410814 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod020efd87_3f4e_4762_9853_4f08d7f744cd.slice/crio-c2b34c5a53adcd4bdd983f604b5c808e7cfd8137391d3dec949665d2f3a4ea45 WatchSource:0}: Error finding container c2b34c5a53adcd4bdd983f604b5c808e7cfd8137391d3dec949665d2f3a4ea45: Status 404 returned error can't find the container with id c2b34c5a53adcd4bdd983f604b5c808e7cfd8137391d3dec949665d2f3a4ea45 Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.449985 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.469208 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.576228 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.670310 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.744593 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.749727 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.744840 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.755199 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. 
No retries permitted until 2026-02-01 07:06:16.755171786 +0000 UTC m=+1127.241074149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.749915 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: E0201 07:06:15.755680 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:16.755667369 +0000 UTC m=+1127.241569742 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.844157 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-646f6"] Feb 01 07:06:15 crc kubenswrapper[5127]: W0201 07:06:15.875998 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ada428_9d6c_41dc_8c3f_a6684d72f4b3.slice/crio-e7fdf1ff084d77950d53b47dae66571f872174ce74fb78f5e2b2c5de8c98f719 WatchSource:0}: Error finding container e7fdf1ff084d77950d53b47dae66571f872174ce74fb78f5e2b2c5de8c98f719: Status 404 returned error can't find the container with id e7fdf1ff084d77950d53b47dae66571f872174ce74fb78f5e2b2c5de8c98f719 Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.878728 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p"] Feb 01 07:06:15 crc kubenswrapper[5127]: I0201 07:06:15.887094 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.080367 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.098947 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj"] Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.115698 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60763bd0_4a99_48da_b53f_1dfddcfd2dda.slice/crio-19fcd90ab48e2da3c095d83c7b31b2a0c5fc71f720b2109e60f97cffdff2df6b WatchSource:0}: Error finding container 19fcd90ab48e2da3c095d83c7b31b2a0c5fc71f720b2109e60f97cffdff2df6b: Status 404 returned error can't find the container with id 19fcd90ab48e2da3c095d83c7b31b2a0c5fc71f720b2109e60f97cffdff2df6b Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.117060 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn"] Feb 01 07:06:16 crc 
kubenswrapper[5127]: I0201 07:06:16.133938 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx"] Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.135077 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda050d4cf_e8ae_4983_aeef_5504bd4ffdc3.slice/crio-73fa72e9efc547a2834f6ff8ad8b8226b0cd362a89956de8d44e567be24cca76 WatchSource:0}: Error finding container 73fa72e9efc547a2834f6ff8ad8b8226b0cd362a89956de8d44e567be24cca76: Status 404 returned error can't find the container with id 73fa72e9efc547a2834f6ff8ad8b8226b0cd362a89956de8d44e567be24cca76 Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.151397 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c57a02e_b635_4a20_921e_fc1ce29bd6e1.slice/crio-650007a4a6f4c71cdc373b80b1bd99f31bf81a7ca9faea15724d6b37718534db WatchSource:0}: Error finding container 650007a4a6f4c71cdc373b80b1bd99f31bf81a7ca9faea15724d6b37718534db: Status 404 returned error can't find the container with id 650007a4a6f4c71cdc373b80b1bd99f31bf81a7ca9faea15724d6b37718534db Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.199036 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" event={"ID":"155e3129-7d47-4eef-ae17-445a4847e3c4","Type":"ContainerStarted","Data":"cd50d4fbdd7a41b64a72da2e73ef6fb3419d9b9d32c91438dadde308f77b59ec"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.202232 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-97rgn"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.205725 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" event={"ID":"22ada428-9d6c-41dc-8c3f-a6684d72f4b3","Type":"ContainerStarted","Data":"e7fdf1ff084d77950d53b47dae66571f872174ce74fb78f5e2b2c5de8c98f719"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.209541 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" event={"ID":"e4b9555b-b0a0-48c0-a488-2fa76ba13e19","Type":"ContainerStarted","Data":"1331acc561ac34684e990d02a1ba72131c62b6aebfa6a23cd2eb64172769da03"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.211319 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" event={"ID":"a050d4cf-e8ae-4983-aeef-5504bd4ffdc3","Type":"ContainerStarted","Data":"73fa72e9efc547a2834f6ff8ad8b8226b0cd362a89956de8d44e567be24cca76"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.212575 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.217512 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" event={"ID":"d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d","Type":"ContainerStarted","Data":"2014749e73329594597099fb185f8b6682b61f00de98eacab677004c5826c9e3"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.224189 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" event={"ID":"6c57a02e-b635-4a20-921e-fc1ce29bd6e1","Type":"ContainerStarted","Data":"650007a4a6f4c71cdc373b80b1bd99f31bf81a7ca9faea15724d6b37718534db"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.227062 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" event={"ID":"41b36a3a-ccdb-4db2-b23b-110fdd81e06b","Type":"ContainerStarted","Data":"513f9b6c239e2f6ceb619cac565a5f886e0b237bd0bf6c75a7cf603fa44cceb6"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.228920 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.230993 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" event={"ID":"60763bd0-4a99-48da-b53f-1dfddcfd2dda","Type":"ContainerStarted","Data":"19fcd90ab48e2da3c095d83c7b31b2a0c5fc71f720b2109e60f97cffdff2df6b"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.234108 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" event={"ID":"464ffa34-bb5b-4e78-9fa1-d106fd67de1d","Type":"ContainerStarted","Data":"7f50436534098d89ec0d36613658a5181853f42a01542d20aae5c201a7e4852c"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.245768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" event={"ID":"dfba9a52-4b08-4001-bec8-0faf57fb61a0","Type":"ContainerStarted","Data":"0fc1388b60cfec67278105b7b69c676e48f8a86bdcfc94f070306a3e081d7060"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.245838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" event={"ID":"020efd87-3f4e-4762-9853-4f08d7f744cd","Type":"ContainerStarted","Data":"c2b34c5a53adcd4bdd983f604b5c808e7cfd8137391d3dec949665d2f3a4ea45"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.245853 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" event={"ID":"71b843c8-50d1-4b1b-83ca-33d72bb16b5e","Type":"ContainerStarted","Data":"924554bd5540f27fbac3042564108961910cc2c44eb85687cdc0b3a6966113d9"} Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.249589 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.252008 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" event={"ID":"b14e4493-339f-480c-84eb-7be38d967aef","Type":"ContainerStarted","Data":"1acea72eed1e5858b2baeb7c6b3c2ad37748c5688d3cccfcfce31926b5192c67"} Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.254293 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e73360_cfda_420c_8df1_fe2e50b31d0c.slice/crio-ba9cd6c15492cadeb44f72b5573845b1a536ef2dfe078bfb49468c6e3efecb0b WatchSource:0}: Error finding container ba9cd6c15492cadeb44f72b5573845b1a536ef2dfe078bfb49468c6e3efecb0b: Status 404 returned error can't find the container with id 
ba9cd6c15492cadeb44f72b5573845b1a536ef2dfe078bfb49468c6e3efecb0b Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.261647 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf972515b_c0d8_497e_87e8_ec5a8f3e4151.slice/crio-c73bc3e803032fecda733f42ac62321c12cfebfc0085269c5a79c4e0a643986c WatchSource:0}: Error finding container c73bc3e803032fecda733f42ac62321c12cfebfc0085269c5a79c4e0a643986c: Status 404 returned error can't find the container with id c73bc3e803032fecda733f42ac62321c12cfebfc0085269c5a79c4e0a643986c Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.263896 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.264249 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.264296 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:18.264282363 +0000 UTC m=+1128.750184726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.266068 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6jmkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-2lrrd_openstack-operators(f972515b-c0d8-497e-87e8-ec5a8f3e4151): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.267498 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" podUID="f972515b-c0d8-497e-87e8-ec5a8f3e4151" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.267787 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pjd7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-wgpr9_openstack-operators(be2cd21c-5775-450d-9933-9914e99730a6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.270741 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" podUID="be2cd21c-5775-450d-9933-9914e99730a6" Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.294627 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg"] Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.299912 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv"] Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.307491 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25dac7e_0533_4ac5_9fc8_cabbf5e340bc.slice/crio-536e3753f0cfb672e3ba945831c1c69e003bb8fb095220d398aa64beeb138ee6 WatchSource:0}: Error finding container 536e3753f0cfb672e3ba945831c1c69e003bb8fb095220d398aa64beeb138ee6: Status 404 returned error can't find the container with id 536e3753f0cfb672e3ba945831c1c69e003bb8fb095220d398aa64beeb138ee6 Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.310265 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq"] Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.313543 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9zgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-c89bv_openstack-operators(c25dac7e-0533-4ac5-9fc8-cabbf5e340bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.315112 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" podUID="c25dac7e-0533-4ac5-9fc8-cabbf5e340bc" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.317704 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vb5gp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-bbcxg_openstack-operators(144ee88b-a5c7-46da-9e39-8f3c71d9499d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.318995 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" podUID="144ee88b-a5c7-46da-9e39-8f3c71d9499d" Feb 01 07:06:16 crc kubenswrapper[5127]: W0201 07:06:16.328012 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb22bae8_ed04_41f1_8061_149713da4d9f.slice/crio-1c2a8065cd41266b7905e5bc2a8bfc0c0de118221195864c80dbc51f86c1d20c WatchSource:0}: Error finding container 1c2a8065cd41266b7905e5bc2a8bfc0c0de118221195864c80dbc51f86c1d20c: Status 404 returned error can't find the container with id 1c2a8065cd41266b7905e5bc2a8bfc0c0de118221195864c80dbc51f86c1d20c Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.332096 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4jgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-kwkzq_openstack-operators(cb22bae8-ed04-41f1-8061-149713da4d9f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.333807 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" podUID="cb22bae8-ed04-41f1-8061-149713da4d9f" Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.366056 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.366230 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.366274 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:18.366260982 +0000 UTC m=+1128.852163345 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.771474 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.771767 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: I0201 07:06:16.771830 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.771894 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:18.771866408 +0000 UTC m=+1129.257768771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.771958 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 07:06:16 crc kubenswrapper[5127]: E0201 07:06:16.772015 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:18.771999402 +0000 UTC m=+1129.257901765 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.283355 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" event={"ID":"f972515b-c0d8-497e-87e8-ec5a8f3e4151","Type":"ContainerStarted","Data":"c73bc3e803032fecda733f42ac62321c12cfebfc0085269c5a79c4e0a643986c"} Feb 01 07:06:17 crc kubenswrapper[5127]: E0201 07:06:17.285184 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" podUID="f972515b-c0d8-497e-87e8-ec5a8f3e4151" Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.287003 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" event={"ID":"c25dac7e-0533-4ac5-9fc8-cabbf5e340bc","Type":"ContainerStarted","Data":"536e3753f0cfb672e3ba945831c1c69e003bb8fb095220d398aa64beeb138ee6"} Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.288202 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" event={"ID":"43e73360-cfda-420c-8df1-fe2e50b31d0c","Type":"ContainerStarted","Data":"ba9cd6c15492cadeb44f72b5573845b1a536ef2dfe078bfb49468c6e3efecb0b"} Feb 01 07:06:17 crc kubenswrapper[5127]: E0201 07:06:17.288957 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" podUID="c25dac7e-0533-4ac5-9fc8-cabbf5e340bc" Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.290310 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" event={"ID":"ec968356-989e-4e17-b755-66c8a2b8109a","Type":"ContainerStarted","Data":"84c249d5722180438c3aa04f9193d2586cac6b300b9c7ac8fb3624ff2646816c"} Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.293562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" event={"ID":"be2cd21c-5775-450d-9933-9914e99730a6","Type":"ContainerStarted","Data":"2066affe1dd31cd6603b2039a6322df84d45f88d91be450ef38694aaff926740"} Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.301713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" event={"ID":"cb22bae8-ed04-41f1-8061-149713da4d9f","Type":"ContainerStarted","Data":"1c2a8065cd41266b7905e5bc2a8bfc0c0de118221195864c80dbc51f86c1d20c"} Feb 01 07:06:17 crc kubenswrapper[5127]: I0201 07:06:17.303980 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" 
event={"ID":"144ee88b-a5c7-46da-9e39-8f3c71d9499d","Type":"ContainerStarted","Data":"224479bdd62736a31cb9bddf27b61a1108cc694b5bfddd44c891d8e6ac28ed74"} Feb 01 07:06:17 crc kubenswrapper[5127]: E0201 07:06:17.307290 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" podUID="be2cd21c-5775-450d-9933-9914e99730a6" Feb 01 07:06:17 crc kubenswrapper[5127]: E0201 07:06:17.307529 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" podUID="cb22bae8-ed04-41f1-8061-149713da4d9f" Feb 01 07:06:17 crc kubenswrapper[5127]: E0201 07:06:17.308020 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" podUID="144ee88b-a5c7-46da-9e39-8f3c71d9499d" Feb 01 07:06:18 crc kubenswrapper[5127]: I0201 07:06:18.299975 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.300161 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.300209 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:22.300194245 +0000 UTC m=+1132.786096608 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.312424 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" podUID="be2cd21c-5775-450d-9933-9914e99730a6" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.312893 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" podUID="f972515b-c0d8-497e-87e8-ec5a8f3e4151" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.313726 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" podUID="cb22bae8-ed04-41f1-8061-149713da4d9f" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.314043 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" podUID="c25dac7e-0533-4ac5-9fc8-cabbf5e340bc" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.314026 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" podUID="144ee88b-a5c7-46da-9e39-8f3c71d9499d" Feb 01 07:06:18 crc kubenswrapper[5127]: I0201 07:06:18.401257 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.401908 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.401966 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert 
podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:22.401944909 +0000 UTC m=+1132.887847272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: I0201 07:06:18.811004 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:18 crc kubenswrapper[5127]: I0201 07:06:18.811439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.811535 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.811732 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:22.811703537 +0000 UTC m=+1133.297605900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.811602 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 07:06:18 crc kubenswrapper[5127]: E0201 07:06:18.811902 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:22.811867911 +0000 UTC m=+1133.297770474 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: I0201 07:06:22.361949 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.362085 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.362949 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:30.362929064 +0000 UTC m=+1140.848831427 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: I0201 07:06:22.463953 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.464115 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.464205 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:30.464183374 +0000 UTC m=+1140.950085727 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: I0201 07:06:22.870919 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:22 crc kubenswrapper[5127]: I0201 07:06:22.871027 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.871162 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.871228 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.871268 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:30.871240639 +0000 UTC m=+1141.357143012 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found Feb 01 07:06:22 crc kubenswrapper[5127]: E0201 07:06:22.871311 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:30.871294291 +0000 UTC m=+1141.357196654 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found Feb 01 07:06:28 crc kubenswrapper[5127]: E0201 07:06:28.409920 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10" Feb 01 07:06:28 crc kubenswrapper[5127]: E0201 07:06:28.410722 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mh7tx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69d6db494d-k59z5_openstack-operators(020efd87-3f4e-4762-9853-4f08d7f744cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:06:28 crc kubenswrapper[5127]: E0201 07:06:28.412120 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" 
podUID="020efd87-3f4e-4762-9853-4f08d7f744cd" Feb 01 07:06:28 crc kubenswrapper[5127]: E0201 07:06:28.643838 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" podUID="020efd87-3f4e-4762-9853-4f08d7f744cd" Feb 01 07:06:29 crc kubenswrapper[5127]: E0201 07:06:29.059056 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a" Feb 01 07:06:29 crc kubenswrapper[5127]: E0201 07:06:29.059251 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2hvpq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-9r2rh_openstack-operators(43e73360-cfda-420c-8df1-fe2e50b31d0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:06:29 crc kubenswrapper[5127]: E0201 07:06:29.060673 5127 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" podUID="43e73360-cfda-420c-8df1-fe2e50b31d0c" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.656119 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" event={"ID":"155e3129-7d47-4eef-ae17-445a4847e3c4","Type":"ContainerStarted","Data":"f2393d846972f7e299e876ebafba1e8a73d3aab321fed0435c67b59162a5c8e7"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.656469 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.664561 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" event={"ID":"e4b9555b-b0a0-48c0-a488-2fa76ba13e19","Type":"ContainerStarted","Data":"a89f9d23eb7122d67eb4724fea1ab5d1789d7919ed896a1d6bbeacf4f233b585"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.664713 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.666344 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" event={"ID":"71b843c8-50d1-4b1b-83ca-33d72bb16b5e","Type":"ContainerStarted","Data":"d67faa7711307b54aaeeeff77fe01f1d0be5ae66f55adf1c98e351ad90f052fa"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.666966 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.669226 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" event={"ID":"41b36a3a-ccdb-4db2-b23b-110fdd81e06b","Type":"ContainerStarted","Data":"2c8e8b3d97838673284d66c705ecba9109db26912703c76c020642223aac9116"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.669283 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.670846 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" event={"ID":"60763bd0-4a99-48da-b53f-1dfddcfd2dda","Type":"ContainerStarted","Data":"abebb92e03456ee3c799fdd6921929ff48c44aecfc437f394425d89a9dee5153"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.670921 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.672556 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" event={"ID":"ec968356-989e-4e17-b755-66c8a2b8109a","Type":"ContainerStarted","Data":"403a941ba558cace889d159276b23a5603d2f70ca6f754b1a45375308b802464"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.672655 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.674273 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" event={"ID":"dfba9a52-4b08-4001-bec8-0faf57fb61a0","Type":"ContainerStarted","Data":"4f2d8a10a897d532b647a0d2f83bf9441be432888db53839b9f81dc7cbbb7426"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.674394 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.685148 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v" podStartSLOduration=1.973198326 podStartE2EDuration="15.685132466s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.380163328 +0000 UTC m=+1125.866065691" lastFinishedPulling="2026-02-01 07:06:29.092097458 +0000 UTC m=+1139.577999831" observedRunningTime="2026-02-01 07:06:29.676933984 +0000 UTC m=+1140.162836347" watchObservedRunningTime="2026-02-01 07:06:29.685132466 +0000 UTC m=+1140.171034829" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.689048 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" event={"ID":"6c57a02e-b635-4a20-921e-fc1ce29bd6e1","Type":"ContainerStarted","Data":"e5ff9a22ae94c20eafc10344785259d84294b97250ba8ea9a721e5bcfbc6edcc"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.689742 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.704134 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" event={"ID":"b14e4493-339f-480c-84eb-7be38d967aef","Type":"ContainerStarted","Data":"452492c9de1bae2f62feebd512676c11cce757c08cd2701eee98e37875af4574"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.704866 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.707447 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" event={"ID":"22ada428-9d6c-41dc-8c3f-a6684d72f4b3","Type":"ContainerStarted","Data":"7d5d96b2aa1644189460012f6a1a26cddc5e5b00eaaf40592f615d04d3a9fbde"} Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.707567 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" Feb 01 07:06:29 crc kubenswrapper[5127]: E0201 07:06:29.708055 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" podUID="43e73360-cfda-420c-8df1-fe2e50b31d0c" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.734788 5127 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf" podStartSLOduration=2.063965752 podStartE2EDuration="15.734769999s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.422416881 +0000 UTC m=+1125.908319244" lastFinishedPulling="2026-02-01 07:06:29.093221118 +0000 UTC m=+1139.579123491" observedRunningTime="2026-02-01 07:06:29.730123674 +0000 UTC m=+1140.216026037" watchObservedRunningTime="2026-02-01 07:06:29.734769999 +0000 UTC m=+1140.220672362" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.764512 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj" podStartSLOduration=2.794684545 podStartE2EDuration="15.764495634s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.122939937 +0000 UTC m=+1126.608842300" lastFinishedPulling="2026-02-01 07:06:29.092751006 +0000 UTC m=+1139.578653389" observedRunningTime="2026-02-01 07:06:29.760108394 +0000 UTC m=+1140.246010757" watchObservedRunningTime="2026-02-01 07:06:29.764495634 +0000 UTC m=+1140.250397997" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.785187 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b" podStartSLOduration=2.815910199 podStartE2EDuration="15.785166423s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.123460001 +0000 UTC m=+1126.609362374" lastFinishedPulling="2026-02-01 07:06:29.092716235 +0000 UTC m=+1139.578618598" observedRunningTime="2026-02-01 07:06:29.783459237 +0000 UTC m=+1140.269361600" watchObservedRunningTime="2026-02-01 07:06:29.785166423 +0000 UTC m=+1140.271068786" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.813869 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79" podStartSLOduration=1.985318133 podStartE2EDuration="15.813850648s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.261795855 +0000 UTC m=+1125.747698218" lastFinishedPulling="2026-02-01 07:06:29.09032837 +0000 UTC m=+1139.576230733" observedRunningTime="2026-02-01 07:06:29.811240489 +0000 UTC m=+1140.297142852" watchObservedRunningTime="2026-02-01 07:06:29.813850648 +0000 UTC m=+1140.299753011" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.855820 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn" podStartSLOduration=2.989780354 podStartE2EDuration="15.855803424s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.226461209 +0000 UTC m=+1126.712363572" lastFinishedPulling="2026-02-01 07:06:29.092484279 +0000 UTC m=+1139.578386642" observedRunningTime="2026-02-01 07:06:29.837886339 +0000 UTC m=+1140.323788702" watchObservedRunningTime="2026-02-01 07:06:29.855803424 +0000 UTC m=+1140.341705787" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.860549 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc" podStartSLOduration=2.457340507 podStartE2EDuration="15.860542162s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" 
firstStartedPulling="2026-02-01 07:06:15.690195848 +0000 UTC m=+1126.176098211" lastFinishedPulling="2026-02-01 07:06:29.093397503 +0000 UTC m=+1139.579299866" observedRunningTime="2026-02-01 07:06:29.855334381 +0000 UTC m=+1140.341236734" watchObservedRunningTime="2026-02-01 07:06:29.860542162 +0000 UTC m=+1140.346444525" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.928433 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p" podStartSLOduration=2.707057084 podStartE2EDuration="15.928413699s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.879062198 +0000 UTC m=+1126.364964561" lastFinishedPulling="2026-02-01 07:06:29.100418803 +0000 UTC m=+1139.586321176" observedRunningTime="2026-02-01 07:06:29.896530216 +0000 UTC m=+1140.382432579" watchObservedRunningTime="2026-02-01 07:06:29.928413699 +0000 UTC m=+1140.414316062" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.959364 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc" podStartSLOduration=2.7443796750000002 podStartE2EDuration="15.959348146s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.876049017 +0000 UTC m=+1126.361951380" lastFinishedPulling="2026-02-01 07:06:29.091017488 +0000 UTC m=+1139.576919851" observedRunningTime="2026-02-01 07:06:29.958717989 +0000 UTC m=+1140.444620352" watchObservedRunningTime="2026-02-01 07:06:29.959348146 +0000 UTC m=+1140.445250499" Feb 01 07:06:29 crc kubenswrapper[5127]: I0201 07:06:29.960011 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx" podStartSLOduration=3.013079644 podStartE2EDuration="15.960006204s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.153462023 +0000 UTC m=+1126.639364396" lastFinishedPulling="2026-02-01 07:06:29.100388593 +0000 UTC m=+1139.586290956" observedRunningTime="2026-02-01 07:06:29.939120849 +0000 UTC m=+1140.425023282" watchObservedRunningTime="2026-02-01 07:06:29.960006204 +0000 UTC m=+1140.445908567" Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.398692 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.399197 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.399260 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:46.39923959 +0000 UTC m=+1156.885141953 (durationBeforeRetry 16s). 
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.398692 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.399197 5127 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.399260 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert podName:181c451d-b9c2-4b75-b271-a3d33fc7c200 nodeName:}" failed. No retries permitted until 2026-02-01 07:06:46.39923959 +0000 UTC m=+1156.885141953 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert") pod "infra-operator-controller-manager-79955696d6-hv9b6" (UID: "181c451d-b9c2-4b75-b271-a3d33fc7c200") : secret "infra-operator-webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.500641 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.500890 5127 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.500937 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert podName:a6886643-fe68-466d-ab2f-0dfd752dbe0f nodeName:}" failed. No retries permitted until 2026-02-01 07:06:46.500921632 +0000 UTC m=+1156.986823995 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert") pod "openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" (UID: "a6886643-fe68-466d-ab2f-0dfd752dbe0f") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.718883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" event={"ID":"464ffa34-bb5b-4e78-9fa1-d106fd67de1d","Type":"ContainerStarted","Data":"4ba5ee6732da22572b4191c97a4e3158db6966223ef650df4a34f5837f0ed210"}
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.719230 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6"
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.724052 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" event={"ID":"a050d4cf-e8ae-4983-aeef-5504bd4ffdc3","Type":"ContainerStarted","Data":"2e39a389c6c28eed21f8584712dc46b6f18597f02600efbb696f2e78449b27a5"}
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.724444 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn"
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.732335 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" event={"ID":"d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d","Type":"ContainerStarted","Data":"99718d398a96d28bc36b348e846e9fc78f40d59f623588a7677b0fabd244e27f"}
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.745408 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6" podStartSLOduration=3.515460229 podStartE2EDuration="16.745394457s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.862064448 +0000 UTC m=+1126.347966811" lastFinishedPulling="2026-02-01 07:06:29.091998676 +0000 UTC m=+1139.577901039" observedRunningTime="2026-02-01 07:06:30.744895814 +0000 UTC m=+1141.230798177" watchObservedRunningTime="2026-02-01 07:06:30.745394457 +0000 UTC m=+1141.231296820"
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.770939 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn" podStartSLOduration=3.8192035090000003 podStartE2EDuration="16.770921918s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.142705202 +0000 UTC m=+1126.628607565" lastFinishedPulling="2026-02-01 07:06:29.094423601 +0000 UTC m=+1139.580325974" observedRunningTime="2026-02-01 07:06:30.76656367 +0000 UTC m=+1141.252466033" watchObservedRunningTime="2026-02-01 07:06:30.770921918 +0000 UTC m=+1141.256824281"
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.908921 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:30 crc kubenswrapper[5127]: I0201 07:06:30.908980 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.909149 5127 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.909194 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:46.909182129 +0000 UTC m=+1157.395084492 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "webhook-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.909485 5127 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 01 07:06:30 crc kubenswrapper[5127]: E0201 07:06:30.909512 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs podName:7205026a-cd8b-4d94-9581-9fb0b21c5c4c nodeName:}" failed. No retries permitted until 2026-02-01 07:06:46.909505198 +0000 UTC m=+1157.395407561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs") pod "openstack-operator-controller-manager-6b6f655c79-8hfvm" (UID: "7205026a-cd8b-4d94-9581-9fb0b21c5c4c") : secret "metrics-server-cert" not found
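[editor's note] Each failed mount is re-queued by nestedpendingoperations with exponential back-off, which is what "No retries permitted until ... (durationBeforeRetry 16s)" is reporting. Assuming the kubelet's usual ladder for volume operations (500ms initial delay, doubling per failure, capped near 2m2s; treat these constants as assumptions, the log does not state them), 16s corresponds to roughly the sixth consecutive failure, consistent with these pods having retried the mount since their creation at 07:06:14. A sketch of the ladder:

```go
// Approximates the retry ladder behind "durationBeforeRetry 16s" above.
// Constants are assumptions modelled on the kubelet's exponential back-off
// for volume operations; only the doubling pattern is the point.
package main

import (
	"fmt"
	"time"
)

func main() {
	d := 500 * time.Millisecond              // assumed initial delay
	ceiling := 2*time.Minute + 2*time.Second // assumed cap
	for i := 1; i <= 8; i++ {
		fmt.Printf("failure #%d -> durationBeforeRetry %v\n", i, d)
		d *= 2
		if d > ceiling {
			d = ceiling
		}
	}
}
```

Under these assumptions the first five delays sum to about 15.5s, which lines up with a 16s delay being issued roughly 16s after the pod appeared.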
Feb 01 07:06:31 crc kubenswrapper[5127]: I0201 07:06:31.741367 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.464263 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-msn79"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.478457 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-5qgzf"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.505488 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-c5r2v"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.515418 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688" podStartSLOduration=7.149406876 podStartE2EDuration="20.515387815s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.728514314 +0000 UTC m=+1126.214416677" lastFinishedPulling="2026-02-01 07:06:29.094495233 +0000 UTC m=+1139.580397616" observedRunningTime="2026-02-01 07:06:30.784340531 +0000 UTC m=+1141.270242894" watchObservedRunningTime="2026-02-01 07:06:34.515387815 +0000 UTC m=+1145.001290228"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.780308 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-lj688"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.807071 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-kssvc"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.885310 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-sjjbc"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.908722 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-6z59p"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.935403 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-5wvqj"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.941179 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-v8fwx"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.981369 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-646f6"
Feb 01 07:06:34 crc kubenswrapper[5127]: I0201 07:06:34.989801 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-gv8nn"
Feb 01 07:06:35 crc kubenswrapper[5127]: I0201 07:06:35.017045 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ll87b"
Feb 01 07:06:35 crc kubenswrapper[5127]: I0201 07:06:35.452230 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-97rgn"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.831910 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" event={"ID":"be2cd21c-5775-450d-9933-9914e99730a6","Type":"ContainerStarted","Data":"d84d2f66cf0319eb1076e56a3e1a4ccff3964d509cb410430990e7f28814fc0f"}
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.832927 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" event={"ID":"cb22bae8-ed04-41f1-8061-149713da4d9f","Type":"ContainerStarted","Data":"9324276e85ad24949b656f7407bb52ce46a41e59b5f14e2b99f9ec012f793550"}
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.833052 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.833238 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.834155 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" event={"ID":"144ee88b-a5c7-46da-9e39-8f3c71d9499d","Type":"ContainerStarted","Data":"dedf7602ad65d22d66de1605411cc1d00c58ce50a37491699d21e7a82de87afa"}
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.835051 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" event={"ID":"f972515b-c0d8-497e-87e8-ec5a8f3e4151","Type":"ContainerStarted","Data":"113380e4dfc1c1efe57f6fa2ad5d85a20ce47a83b2a62439042fd63681a31327"}
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.835207 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.836190 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" event={"ID":"c25dac7e-0533-4ac5-9fc8-cabbf5e340bc","Type":"ContainerStarted","Data":"aa04ffa10fd02a02fca7f9692e0b6dba22e3ee8a183f5b2707a48b5bfa064421"}
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.836593 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.848700 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9" podStartSLOduration=3.3761945620000002 podStartE2EDuration="27.848677727s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.266300327 +0000 UTC m=+1126.752202690" lastFinishedPulling="2026-02-01 07:06:40.738783492 +0000 UTC m=+1151.224685855" observedRunningTime="2026-02-01 07:06:41.847974387 +0000 UTC m=+1152.333876790" watchObservedRunningTime="2026-02-01 07:06:41.848677727 +0000 UTC m=+1152.334580100"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.874400 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq" podStartSLOduration=3.446509395 podStartE2EDuration="27.874373062s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.331986805 +0000 UTC m=+1126.817889168" lastFinishedPulling="2026-02-01 07:06:40.759850452 +0000 UTC m=+1151.245752835" observedRunningTime="2026-02-01 07:06:41.863060106 +0000 UTC m=+1152.348962499" watchObservedRunningTime="2026-02-01 07:06:41.874373062 +0000 UTC m=+1152.360275455"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.882497 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-bbcxg" podStartSLOduration=3.441011885 podStartE2EDuration="27.882478191s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.317558564 +0000 UTC m=+1126.803460927" lastFinishedPulling="2026-02-01 07:06:40.75902487 +0000 UTC m=+1151.244927233" observedRunningTime="2026-02-01 07:06:41.877958858 +0000 UTC m=+1152.363861231" watchObservedRunningTime="2026-02-01 07:06:41.882478191 +0000 UTC m=+1152.368380574"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.899055 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd" podStartSLOduration=3.419619225 podStartE2EDuration="27.899035988s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.265891966 +0000 UTC m=+1126.751794329" lastFinishedPulling="2026-02-01 07:06:40.745308709 +0000 UTC m=+1151.231211092" observedRunningTime="2026-02-01 07:06:41.896348107 +0000 UTC m=+1152.382250480" watchObservedRunningTime="2026-02-01 07:06:41.899035988 +0000 UTC m=+1152.384938371"
Feb 01 07:06:41 crc kubenswrapper[5127]: I0201 07:06:41.921455 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv" podStartSLOduration=3.491298795 podStartE2EDuration="27.921436614s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.313387381 +0000 UTC m=+1126.799289744" lastFinishedPulling="2026-02-01 07:06:40.7435252 +0000 UTC m=+1151.229427563" observedRunningTime="2026-02-01 07:06:41.919817841 +0000 UTC m=+1152.405720214" watchObservedRunningTime="2026-02-01 07:06:41.921436614 +0000 UTC m=+1152.407338987"
Feb 01 07:06:42 crc kubenswrapper[5127]: I0201 07:06:42.237723 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 01 07:06:42 crc kubenswrapper[5127]: I0201 07:06:42.849906 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" event={"ID":"43e73360-cfda-420c-8df1-fe2e50b31d0c","Type":"ContainerStarted","Data":"d80ea967e1cd802117cebf57cca4beea28fc47d8b56ccbdfefb925ac8710b674"}
Feb 01 07:06:42 crc kubenswrapper[5127]: I0201 07:06:42.852111 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh"
Feb 01 07:06:42 crc kubenswrapper[5127]: I0201 07:06:42.884911 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh" podStartSLOduration=3.436116593 podStartE2EDuration="28.884885956s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:16.264854188 +0000 UTC m=+1126.750756551" lastFinishedPulling="2026-02-01 07:06:41.713623541 +0000 UTC m=+1152.199525914" observedRunningTime="2026-02-01 07:06:42.878241087 +0000 UTC m=+1153.364143470" watchObservedRunningTime="2026-02-01 07:06:42.884885956 +0000 UTC m=+1153.370788339"
Feb 01 07:06:43 crc kubenswrapper[5127]: I0201 07:06:43.857680 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" event={"ID":"020efd87-3f4e-4762-9853-4f08d7f744cd","Type":"ContainerStarted","Data":"83d0a2ff9c5a851f68870a913f1f767cdc3c05745fa84d960738586300cd11b6"}
Feb 01 07:06:43 crc kubenswrapper[5127]: I0201 07:06:43.857961 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5"
Feb 01 07:06:43 crc kubenswrapper[5127]: I0201 07:06:43.887460 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5" podStartSLOduration=2.595367972 podStartE2EDuration="29.887436485s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:15.421398294 +0000 UTC m=+1125.907300657" lastFinishedPulling="2026-02-01 07:06:42.713466797 +0000 UTC m=+1153.199369170" observedRunningTime="2026-02-01 07:06:43.877437325 +0000 UTC m=+1154.363339698" watchObservedRunningTime="2026-02-01 07:06:43.887436485 +0000 UTC m=+1154.373338858"
Feb 01 07:06:45 crc kubenswrapper[5127]: I0201 07:06:45.047235 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-2lrrd"
Feb 01 07:06:45 crc kubenswrapper[5127]: I0201 07:06:45.048175 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-wgpr9"
Feb 01 07:06:45 crc kubenswrapper[5127]: I0201 07:06:45.108069 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-kwkzq"
Feb 01 07:06:45 crc kubenswrapper[5127]: I0201 07:06:45.353183 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-c89bv"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.468080 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.477265 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/181c451d-b9c2-4b75-b271-a3d33fc7c200-cert\") pod \"infra-operator-controller-manager-79955696d6-hv9b6\" (UID: \"181c451d-b9c2-4b75-b271-a3d33fc7c200\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
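[editor's note] The retries end exactly as the back-off predicted: the "No retries permitted until 2026-02-01 07:06:46..." deadlines expire, and at 07:06:46 the same cert volume that failed at 07:06:30 flips to "MountVolume.SetUp succeeded" (above for infra-operator, and for the remaining secrets just below), meaning the webhook and metrics certificates were created in the meantime. A hedged client-go sketch for spot-checking those secrets; the four names and the namespace are copied from the log, everything else is standard client-go boilerplate and not taken from these lines:

```go
// Checks for the cert secrets whose absence caused the MountVolume failures.
// Secret names and namespace come from the log; the in-cluster config setup
// is generic boilerplate.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	names := []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
		"webhook-server-cert",
		"metrics-server-cert",
	}
	for _, name := range names {
		_, err := client.CoreV1().Secrets("openstack-operators").
			Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%s: err=%v\n", name, err) // err == nil once the secret exists
	}
}
```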
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.569213 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.573942 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6886643-fe68-466d-ab2f-0dfd752dbe0f-cert\") pod \"openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf\" (UID: \"a6886643-fe68-466d-ab2f-0dfd752dbe0f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.680090 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-qs6z4"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.687179 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.760647 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nszkn"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.768799 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.977201 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.977275 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.979915 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"]
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.984329 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-metrics-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:46 crc kubenswrapper[5127]: I0201 07:06:46.984901 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7205026a-cd8b-4d94-9581-9fb0b21c5c4c-webhook-certs\") pod \"openstack-operator-controller-manager-6b6f655c79-8hfvm\" (UID: \"7205026a-cd8b-4d94-9581-9fb0b21c5c4c\") " pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.052779 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"]
Feb 01 07:06:47 crc kubenswrapper[5127]: W0201 07:06:47.053471 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6886643_fe68_466d_ab2f_0dfd752dbe0f.slice/crio-f5d9b31d9a9a06236f21f14609171cd603ec2be582d7b8a2e453088699f03248 WatchSource:0}: Error finding container f5d9b31d9a9a06236f21f14609171cd603ec2be582d7b8a2e453088699f03248: Status 404 returned error can't find the container with id f5d9b31d9a9a06236f21f14609171cd603ec2be582d7b8a2e453088699f03248
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.257714 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-jb97g"
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.265296 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.570896 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"]
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.887096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" event={"ID":"181c451d-b9c2-4b75-b271-a3d33fc7c200","Type":"ContainerStarted","Data":"2fa0400697b52334cd5a2656224a465f51c06dbd094c9f4472d6ff1c597bec41"}
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.888427 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" event={"ID":"a6886643-fe68-466d-ab2f-0dfd752dbe0f","Type":"ContainerStarted","Data":"f5d9b31d9a9a06236f21f14609171cd603ec2be582d7b8a2e453088699f03248"}
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.889893 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" event={"ID":"7205026a-cd8b-4d94-9581-9fb0b21c5c4c","Type":"ContainerStarted","Data":"edb94c15b5f28415876f6d8a67c6399ff34dca71a46b09f7a3c1993161c60769"}
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.889936 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" event={"ID":"7205026a-cd8b-4d94-9581-9fb0b21c5c4c","Type":"ContainerStarted","Data":"11c73ca2370cdb48b795884e511ec9c35a174c43e733e9bd360fd3f6527bb8e3"}
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.890029 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:06:47 crc kubenswrapper[5127]: I0201 07:06:47.920545 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm" podStartSLOduration=33.920515472 podStartE2EDuration="33.920515472s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:06:47.915370193 +0000 UTC m=+1158.401272556" watchObservedRunningTime="2026-02-01 07:06:47.920515472 +0000 UTC m=+1158.406417835"
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.908998 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" event={"ID":"a6886643-fe68-466d-ab2f-0dfd752dbe0f","Type":"ContainerStarted","Data":"2bb50c6b1c8e099dac848275dff8f940d3c8f81416565e2f11a502117e06e3fa"}
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.909481 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.913490 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" event={"ID":"181c451d-b9c2-4b75-b271-a3d33fc7c200","Type":"ContainerStarted","Data":"effcec32fb826d296dce3dd44de6c11832d18c39a330a1a763f3909702c97acc"}
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.913641 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.964902 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf" podStartSLOduration=33.61478853 podStartE2EDuration="35.964866984s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:47.056185263 +0000 UTC m=+1157.542087626" lastFinishedPulling="2026-02-01 07:06:49.406263717 +0000 UTC m=+1159.892166080" observedRunningTime="2026-02-01 07:06:49.948952553 +0000 UTC m=+1160.434854956" watchObservedRunningTime="2026-02-01 07:06:49.964866984 +0000 UTC m=+1160.450769387"
Feb 01 07:06:49 crc kubenswrapper[5127]: I0201 07:06:49.972528 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6" podStartSLOduration=33.559445422 podStartE2EDuration="35.97250438s" podCreationTimestamp="2026-02-01 07:06:14 +0000 UTC" firstStartedPulling="2026-02-01 07:06:46.990105305 +0000 UTC m=+1157.476007688" lastFinishedPulling="2026-02-01 07:06:49.403164283 +0000 UTC m=+1159.889066646" observedRunningTime="2026-02-01 07:06:49.968007808 +0000 UTC m=+1160.453910171" watchObservedRunningTime="2026-02-01 07:06:49.97250438 +0000 UTC m=+1160.458406783"
Feb 01 07:06:54 crc kubenswrapper[5127]: I0201 07:06:54.607627 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-k59z5"
Feb 01 07:06:55 crc kubenswrapper[5127]: I0201 07:06:55.290737 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-9r2rh"
Feb 01 07:06:56 crc kubenswrapper[5127]: I0201 07:06:56.695882 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf"
Feb 01 07:06:56 crc kubenswrapper[5127]: I0201 07:06:56.775201 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-hv9b6"
Feb 01 07:06:57 crc kubenswrapper[5127]: I0201 07:06:57.271294 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b6f655c79-8hfvm"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.127085 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"]
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.128615 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.136829 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.137104 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.137279 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.137880 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bksbn"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.143004 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"]
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.201294 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"]
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.202501 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.204379 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.219303 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"]
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.248493 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.248549 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb7sr\" (UniqueName: \"kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.248574 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.248618 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.248720 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d8g9\" (UniqueName: \"kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.350834 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d8g9\" (UniqueName: \"kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.350902 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.350926 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb7sr\" (UniqueName: \"kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.350955 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.350992 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.352053 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.352185 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.352622 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.372678 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb7sr\" (UniqueName: \"kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr\") pod \"dnsmasq-dns-84bb9d8bd9-5sqqv\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.373255 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d8g9\" (UniqueName: \"kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9\") pod \"dnsmasq-dns-5f854695bc-m7qnj\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.482186 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.519141 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj"
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.795197 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"]
Feb 01 07:07:11 crc kubenswrapper[5127]: I0201 07:07:11.874712 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"]
Feb 01 07:07:12 crc kubenswrapper[5127]: I0201 07:07:12.074763 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv" event={"ID":"249ffd47-8836-46ca-b80c-9fa90cceea62","Type":"ContainerStarted","Data":"f5c882078df39e4eb5f52fa3d52286aaf87b5e2feaaca387c96d95fcc37990bd"}
Feb 01 07:07:12 crc kubenswrapper[5127]: I0201 07:07:12.076042 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj" event={"ID":"b0c05dba-131a-4ed9-9895-67f80ab1e1f7","Type":"ContainerStarted","Data":"e670ca7341ab64a1571f5fde5230d089b0a461a8b0fa5be4cf5721169c7385f3"}
Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.283549 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"]
Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.297222 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"]
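[editor's note] The DELETE/ADD churn above is a Deployment rolling over: each change to the dnsmasq pod template produces a ReplicaSet with a fresh pod-template-hash, hence the changing name suffixes (84bb9d8bd9, 5f854695bc, 744ffd65bc, and another one below), with each replacement pod created before its predecessor even finishes starting. A client-go sketch for making that churn visible; the app=dnsmasq-dns label selector is an assumption, since the log does not show the Deployment's labels, so substitute the real ones:

```go
// Lists dnsmasq-dns pods with their pod-template-hash labels to show which
// ReplicaSet generation each pod belongs to. The label selector is assumed.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	pods, err := client.CoreV1().Pods("openstack").List(context.TODO(),
		metav1.ListOptions{LabelSelector: "app=dnsmasq-dns"}) // assumed label
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		fmt.Println(p.Name, p.Labels["pod-template-hash"], p.Status.Phase)
	}
}
```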
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.314029 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"] Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.490765 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4gj\" (UniqueName: \"kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.491179 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.491213 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.593096 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g4gj\" (UniqueName: \"kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.593184 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.593217 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.594258 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.594427 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.610373 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4gj\" (UniqueName: 
\"kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj\") pod \"dnsmasq-dns-744ffd65bc-wqxk7\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.659932 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:13 crc kubenswrapper[5127]: I0201 07:07:13.906526 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"] Feb 01 07:07:13 crc kubenswrapper[5127]: W0201 07:07:13.931399 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90177717_535c_4840_9570_50d4f8363937.slice/crio-8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f WatchSource:0}: Error finding container 8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f: Status 404 returned error can't find the container with id 8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.028186 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"] Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.047447 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.048529 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.067716 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.096571 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" event={"ID":"90177717-535c-4840-9570-50d4f8363937","Type":"ContainerStarted","Data":"8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f"} Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.206366 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7sc5\" (UniqueName: \"kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.206407 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.206451 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.308048 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7sc5\" (UniqueName: 
\"kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.308096 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.308139 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.309054 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.309737 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.329299 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7sc5\" (UniqueName: \"kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5\") pod \"dnsmasq-dns-95f5f6995-9g5pj\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.379239 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.426791 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.427899 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.429982 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.430006 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.430382 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.430661 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.430825 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wd9jl" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.431012 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.432952 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.442348 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.613953 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614470 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkx9\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614545 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614614 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614693 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614728 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614809 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.614853 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.615015 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.615107 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.615134 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716300 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkx9\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716364 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716380 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " 
pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716413 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716434 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716488 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716511 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716537 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.716554 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.717038 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.717435 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.733314 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.733887 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.737015 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.737237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.740108 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.742641 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.743460 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.743989 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.752130 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.756115 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkx9\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9\") pod \"rabbitmq-server-0\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") " pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.788923 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 07:07:14 crc kubenswrapper[5127]: I0201 07:07:14.825454 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:07:14 crc kubenswrapper[5127]: W0201 07:07:14.830607 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6e0a90_6e84_4065_89b5_fc45b01d5970.slice/crio-fa4343c8e4d7e6f16ba1e62365e74c02b75c7aaba0ceecbb4a05de979da46358 WatchSource:0}: Error finding container fa4343c8e4d7e6f16ba1e62365e74c02b75c7aaba0ceecbb4a05de979da46358: Status 404 returned error can't find the container with id fa4343c8e4d7e6f16ba1e62365e74c02b75c7aaba0ceecbb4a05de979da46358 Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.105549 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" event={"ID":"8a6e0a90-6e84-4065-89b5-fc45b01d5970","Type":"ContainerStarted","Data":"fa4343c8e4d7e6f16ba1e62365e74c02b75c7aaba0ceecbb4a05de979da46358"} Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.175272 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.176636 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.182394 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.182732 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.182847 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.182948 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.183039 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.184338 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.184645 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-299p6" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.207918 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.248725 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327167 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327234 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djxw\" (UniqueName: 
\"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327306 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327339 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327363 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327380 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327400 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327416 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327447 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327503 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.327519 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431184 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431222 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431305 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431371 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431388 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431432 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431459 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7djxw\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431542 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431568 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431605 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.431625 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.435434 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.435920 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.436455 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.436870 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.437699 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.440367 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.440487 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.441596 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.442047 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.444088 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.457209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djxw\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.474648 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:15 crc kubenswrapper[5127]: I0201 07:07:15.516094 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.553952 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.555645 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.558075 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.558719 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ccz6b" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.560254 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.560568 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.571720 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.573379 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658457 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658508 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658533 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658708 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658744 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658780 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5vf\" (UniqueName: \"kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658820 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.658850 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.760760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761231 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761265 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5vf\" (UniqueName: \"kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761301 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761328 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761374 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761390 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.762514 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.761169 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.763117 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.763533 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.764368 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.766166 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.778467 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.790892 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5vf\" (UniqueName: \"kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.792499 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " pod="openstack/openstack-galera-0" Feb 01 07:07:16 crc kubenswrapper[5127]: I0201 07:07:16.881656 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.906182 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.908956 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.923994 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.924255 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ngv8d" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.924473 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.924613 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.924803 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.977632 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.977676 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.977706 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696ct\" (UniqueName: \"kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.977726 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.977748 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.978038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.978262 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:17 crc kubenswrapper[5127]: I0201 07:07:17.978352 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080295 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080442 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080485 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080566 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080623 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080655 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696ct\" (UniqueName: \"kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080683 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.080730 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.081141 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.081724 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.082657 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.083252 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.086483 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.089187 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.090980 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.112906 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696ct\" (UniqueName: \"kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: 
I0201 07:07:18.132234 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.258503 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.264672 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.265867 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.268404 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.268446 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vprl6" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.268468 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.285261 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.388173 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.388232 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.388292 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.388320 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.388350 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrswp\" (UniqueName: \"kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.489862 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrswp\" (UniqueName: 
\"kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.489945 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.489977 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.490028 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.490049 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.490940 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.494391 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.496043 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.496527 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.514296 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrswp\" (UniqueName: \"kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp\") pod \"memcached-0\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: I0201 07:07:18.582539 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 01 07:07:18 crc kubenswrapper[5127]: W0201 07:07:18.935837 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23799dc8_9944_4c3d_a0e1_cf99f5cb7998.slice/crio-5ea2fea9b3ea9c3b8acac3ba2ee394c286b0ef56b1eb9cf2a409be6533288b3a WatchSource:0}: Error finding container 5ea2fea9b3ea9c3b8acac3ba2ee394c286b0ef56b1eb9cf2a409be6533288b3a: Status 404 returned error can't find the container with id 5ea2fea9b3ea9c3b8acac3ba2ee394c286b0ef56b1eb9cf2a409be6533288b3a Feb 01 07:07:19 crc kubenswrapper[5127]: I0201 07:07:19.137139 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerStarted","Data":"5ea2fea9b3ea9c3b8acac3ba2ee394c286b0ef56b1eb9cf2a409be6533288b3a"} Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.186029 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.187172 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.188914 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wjqwg" Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.198573 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.318496 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfpc\" (UniqueName: \"kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc\") pod \"kube-state-metrics-0\" (UID: \"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf\") " pod="openstack/kube-state-metrics-0" Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.420173 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szfpc\" (UniqueName: \"kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc\") pod \"kube-state-metrics-0\" (UID: \"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf\") " pod="openstack/kube-state-metrics-0" Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.451345 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfpc\" (UniqueName: \"kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc\") pod \"kube-state-metrics-0\" (UID: \"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf\") " pod="openstack/kube-state-metrics-0" Feb 01 07:07:20 crc kubenswrapper[5127]: I0201 07:07:20.506959 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.578315 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.582030 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.595195 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.595518 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.595740 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xn2x9" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.609344 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.610821 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.621322 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.629875 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.697907 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.697961 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57jr5\" (UniqueName: \"kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.697987 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698017 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698039 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698091 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw22w\" (UniqueName: \"kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w\") pod \"ovn-controller-ovs-9przj\" (UID: 
\"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698107 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698121 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698142 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698175 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698196 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698219 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.698237 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800264 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800327 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57jr5\" (UniqueName: \"kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " 
pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800352 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800375 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800391 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800415 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800432 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw22w\" (UniqueName: \"kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800451 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800487 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800505 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800526 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.800541 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.801026 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.801145 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.801486 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.801561 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.801676 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.803992 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.804050 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.804729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.813370 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.813374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.813697 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.834314 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57jr5\" (UniqueName: \"kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5\") pod \"ovn-controller-hqn86\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") " pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.844178 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw22w\" (UniqueName: \"kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w\") pod \"ovn-controller-ovs-9przj\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.910321 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86" Feb 01 07:07:24 crc kubenswrapper[5127]: I0201 07:07:24.936515 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.235355 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.236653 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.239154 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-42xh5" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.239431 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.239559 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.239704 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.240310 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.251182 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.307907 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4wnn\" (UniqueName: \"kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308037 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308066 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308210 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308262 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308347 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308474 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.308518 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409683 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409727 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409783 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409831 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409861 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409876 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.409898 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4wnn\" (UniqueName: \"kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 
07:07:25.411691 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.412315 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.412503 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.413915 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.414450 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.414898 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.417448 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.424966 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4wnn\" (UniqueName: \"kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.431158 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:25 crc kubenswrapper[5127]: I0201 07:07:25.568351 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.378403 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.379768 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.385009 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6d857" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.385152 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.385226 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.385357 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.393805 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549507 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549548 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549596 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549656 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549680 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549709 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k955w\" (UniqueName: \"kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 
07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549897 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.549975 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651466 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651524 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651553 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k955w\" (UniqueName: \"kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651598 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651624 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.651701 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.654267 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.655805 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.657336 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.659542 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.674272 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.676784 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.679689 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.687995 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k955w\" (UniqueName: \"kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.696739 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:27 crc kubenswrapper[5127]: I0201 07:07:27.699309 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:28 crc kubenswrapper[5127]: E0201 07:07:28.312004 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Feb 01 07:07:28 crc kubenswrapper[5127]: E0201 07:07:28.312520 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2d8g9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-m7qnj_openstack(b0c05dba-131a-4ed9-9895-67f80ab1e1f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:07:28 crc kubenswrapper[5127]: E0201 07:07:28.314150 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj" podUID="b0c05dba-131a-4ed9-9895-67f80ab1e1f7" Feb 01 07:07:28 crc kubenswrapper[5127]: E0201 07:07:28.387404 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Feb 01 07:07:28 
crc kubenswrapper[5127]: E0201 07:07:28.387659 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vb7sr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-5sqqv_openstack(249ffd47-8836-46ca-b80c-9fa90cceea62): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:07:28 crc kubenswrapper[5127]: E0201 07:07:28.388852 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv" podUID="249ffd47-8836-46ca-b80c-9fa90cceea62" Feb 01 07:07:28 crc kubenswrapper[5127]: I0201 07:07:28.584000 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:07:28 crc kubenswrapper[5127]: I0201 07:07:28.878004 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:07:28 crc kubenswrapper[5127]: I0201 07:07:28.888025 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 07:07:29 crc kubenswrapper[5127]: W0201 07:07:29.808729 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd440b432_d2ce_4228_90b7_ad0c2e12ec86.slice/crio-41e902a000bb6082673625e78feb3483ce904397675767e01b6a4cc5c37379c8 WatchSource:0}: Error finding container 41e902a000bb6082673625e78feb3483ce904397675767e01b6a4cc5c37379c8: Status 404 returned error 
can't find the container with id 41e902a000bb6082673625e78feb3483ce904397675767e01b6a4cc5c37379c8 Feb 01 07:07:29 crc kubenswrapper[5127]: W0201 07:07:29.813777 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod011ed99a_688f_4874_b6f7_f861080ef9d5.slice/crio-3308442a8576384d0e401106f1cd9598832e0b35eafc0ea16cae645cbc9df338 WatchSource:0}: Error finding container 3308442a8576384d0e401106f1cd9598832e0b35eafc0ea16cae645cbc9df338: Status 404 returned error can't find the container with id 3308442a8576384d0e401106f1cd9598832e0b35eafc0ea16cae645cbc9df338 Feb 01 07:07:29 crc kubenswrapper[5127]: W0201 07:07:29.819115 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824fc658_1c02_4470_9ed3_e4123ddd7575.slice/crio-5ed72d99ca67c3ea63ade9c933fdf6ea0af43aed4fdd91e68c06aea804039233 WatchSource:0}: Error finding container 5ed72d99ca67c3ea63ade9c933fdf6ea0af43aed4fdd91e68c06aea804039233: Status 404 returned error can't find the container with id 5ed72d99ca67c3ea63ade9c933fdf6ea0af43aed4fdd91e68c06aea804039233 Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.064163 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.081959 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.188395 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 07:07:30 crc kubenswrapper[5127]: W0201 07:07:30.190611 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02abfc06_bde0_4894_a5f8_f07207f1ba28.slice/crio-dfc203b3e02671f2206836f97c4531db07136693a3d09e3a94605756f4f47dcf WatchSource:0}: Error finding container dfc203b3e02671f2206836f97c4531db07136693a3d09e3a94605756f4f47dcf: Status 404 returned error can't find the container with id dfc203b3e02671f2206836f97c4531db07136693a3d09e3a94605756f4f47dcf Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.198994 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb7sr\" (UniqueName: \"kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr\") pod \"249ffd47-8836-46ca-b80c-9fa90cceea62\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.199107 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config\") pod \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.199144 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc\") pod \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.201725 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config\") pod 
\"249ffd47-8836-46ca-b80c-9fa90cceea62\" (UID: \"249ffd47-8836-46ca-b80c-9fa90cceea62\") " Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.201784 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d8g9\" (UniqueName: \"kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9\") pod \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\" (UID: \"b0c05dba-131a-4ed9-9895-67f80ab1e1f7\") " Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.202175 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config" (OuterVolumeSpecName: "config") pod "b0c05dba-131a-4ed9-9895-67f80ab1e1f7" (UID: "b0c05dba-131a-4ed9-9895-67f80ab1e1f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.203039 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.203287 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b0c05dba-131a-4ed9-9895-67f80ab1e1f7" (UID: "b0c05dba-131a-4ed9-9895-67f80ab1e1f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.203430 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config" (OuterVolumeSpecName: "config") pod "249ffd47-8836-46ca-b80c-9fa90cceea62" (UID: "249ffd47-8836-46ca-b80c-9fa90cceea62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.212114 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9" (OuterVolumeSpecName: "kube-api-access-2d8g9") pod "b0c05dba-131a-4ed9-9895-67f80ab1e1f7" (UID: "b0c05dba-131a-4ed9-9895-67f80ab1e1f7"). InnerVolumeSpecName "kube-api-access-2d8g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.212199 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr" (OuterVolumeSpecName: "kube-api-access-vb7sr") pod "249ffd47-8836-46ca-b80c-9fa90cceea62" (UID: "249ffd47-8836-46ca-b80c-9fa90cceea62"). InnerVolumeSpecName "kube-api-access-vb7sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.223443 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerStarted","Data":"dfc203b3e02671f2206836f97c4531db07136693a3d09e3a94605756f4f47dcf"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.224735 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj" event={"ID":"b0c05dba-131a-4ed9-9895-67f80ab1e1f7","Type":"ContainerDied","Data":"e670ca7341ab64a1571f5fde5230d089b0a461a8b0fa5be4cf5721169c7385f3"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.224800 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-m7qnj" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.227186 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerStarted","Data":"3308442a8576384d0e401106f1cd9598832e0b35eafc0ea16cae645cbc9df338"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.229199 5127 generic.go:334] "Generic (PLEG): container finished" podID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerID="eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf" exitCode=0 Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.229294 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" event={"ID":"8a6e0a90-6e84-4065-89b5-fc45b01d5970","Type":"ContainerDied","Data":"eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.236642 5127 generic.go:334] "Generic (PLEG): container finished" podID="90177717-535c-4840-9570-50d4f8363937" containerID="0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3" exitCode=0 Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.245173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" event={"ID":"90177717-535c-4840-9570-50d4f8363937","Type":"ContainerDied","Data":"0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.245860 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d440b432-d2ce-4228-90b7-ad0c2e12ec86","Type":"ContainerStarted","Data":"41e902a000bb6082673625e78feb3483ce904397675767e01b6a4cc5c37379c8"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.252543 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerStarted","Data":"5ed72d99ca67c3ea63ade9c933fdf6ea0af43aed4fdd91e68c06aea804039233"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.254660 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv" event={"ID":"249ffd47-8836-46ca-b80c-9fa90cceea62","Type":"ContainerDied","Data":"f5c882078df39e4eb5f52fa3d52286aaf87b5e2feaaca387c96d95fcc37990bd"} Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.254740 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-5sqqv" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.310610 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.310635 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249ffd47-8836-46ca-b80c-9fa90cceea62-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.310644 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d8g9\" (UniqueName: \"kubernetes.io/projected/b0c05dba-131a-4ed9-9895-67f80ab1e1f7-kube-api-access-2d8g9\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.310654 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb7sr\" (UniqueName: \"kubernetes.io/projected/249ffd47-8836-46ca-b80c-9fa90cceea62-kube-api-access-vb7sr\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.415447 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.463652 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.482137 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-5sqqv"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.544679 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.557793 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-m7qnj"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.564869 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.661623 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:07:30 crc kubenswrapper[5127]: E0201 07:07:30.724358 5127 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 01 07:07:30 crc kubenswrapper[5127]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/90177717-535c-4840-9570-50d4f8363937/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 01 07:07:30 crc kubenswrapper[5127]: > podSandboxID="8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f" Feb 01 07:07:30 crc kubenswrapper[5127]: E0201 07:07:30.727039 5127 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 01 07:07:30 crc kubenswrapper[5127]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9g4gj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-wqxk7_openstack(90177717-535c-4840-9570-50d4f8363937): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/90177717-535c-4840-9570-50d4f8363937/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 01 07:07:30 crc kubenswrapper[5127]: > logger="UnhandledError" Feb 01 07:07:30 crc kubenswrapper[5127]: E0201 07:07:30.728359 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/90177717-535c-4840-9570-50d4f8363937/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" podUID="90177717-535c-4840-9570-50d4f8363937" Feb 01 07:07:30 crc kubenswrapper[5127]: W0201 07:07:30.741857 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36a8a4ac_b308_4bb8_be43_dddca18b1bc1.slice/crio-8f8fa59ab6f1441bef02c359e839cede60b6e8dc026480d5e11ec944daf69e38 WatchSource:0}: Error finding container 8f8fa59ab6f1441bef02c359e839cede60b6e8dc026480d5e11ec944daf69e38: Status 404 returned error can't find the container with id 
8f8fa59ab6f1441bef02c359e839cede60b6e8dc026480d5e11ec944daf69e38 Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.771510 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:07:30 crc kubenswrapper[5127]: W0201 07:07:30.774694 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd523dcf2_c3fd_4473_ae9b_27e64a77205d.slice/crio-1e4b2acc6d88e8efd1357b52f1b803a8dec8b480fd4d2cef1e7c7c216de91616 WatchSource:0}: Error finding container 1e4b2acc6d88e8efd1357b52f1b803a8dec8b480fd4d2cef1e7c7c216de91616: Status 404 returned error can't find the container with id 1e4b2acc6d88e8efd1357b52f1b803a8dec8b480fd4d2cef1e7c7c216de91616 Feb 01 07:07:30 crc kubenswrapper[5127]: I0201 07:07:30.850274 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:07:30 crc kubenswrapper[5127]: W0201 07:07:30.883280 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3845481_effe_4cb2_9249_e9311df519a0.slice/crio-b408ff1c80292ac5017e44c6075c08c00a28d68c4ade1b10246a67d7709fca75 WatchSource:0}: Error finding container b408ff1c80292ac5017e44c6075c08c00a28d68c4ade1b10246a67d7709fca75: Status 404 returned error can't find the container with id b408ff1c80292ac5017e44c6075c08c00a28d68c4ade1b10246a67d7709fca75 Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.264840 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerStarted","Data":"8f8fa59ab6f1441bef02c359e839cede60b6e8dc026480d5e11ec944daf69e38"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.267057 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerStarted","Data":"1e4b2acc6d88e8efd1357b52f1b803a8dec8b480fd4d2cef1e7c7c216de91616"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.269245 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf","Type":"ContainerStarted","Data":"2cc9dc9e4a17064cc38d9068fcaab64cdd0588ffc8e0f3a8abe5165aefc4c9d7"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.271266 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" event={"ID":"8a6e0a90-6e84-4065-89b5-fc45b01d5970","Type":"ContainerStarted","Data":"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.272421 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.273704 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86" event={"ID":"4b0be460-5699-4787-9c9e-90df6400faed","Type":"ContainerStarted","Data":"967078256ac9193fa1d832542effe48272709fdb43e54388aa1cc62d7d25f55e"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.275262 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerStarted","Data":"b408ff1c80292ac5017e44c6075c08c00a28d68c4ade1b10246a67d7709fca75"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.276771 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerStarted","Data":"c82dbbe0eb6ca71336161015ea284573bf1cf53a6b5fb5824650267c1ab2d8a7"} Feb 01 07:07:31 crc kubenswrapper[5127]: I0201 07:07:31.298692 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" podStartSLOduration=2.2400467 podStartE2EDuration="17.298671622s" podCreationTimestamp="2026-02-01 07:07:14 +0000 UTC" firstStartedPulling="2026-02-01 07:07:14.832785539 +0000 UTC m=+1185.318687902" lastFinishedPulling="2026-02-01 07:07:29.891410461 +0000 UTC m=+1200.377312824" observedRunningTime="2026-02-01 07:07:31.290048448 +0000 UTC m=+1201.775950821" watchObservedRunningTime="2026-02-01 07:07:31.298671622 +0000 UTC m=+1201.784573985" Feb 01 07:07:32 crc kubenswrapper[5127]: I0201 07:07:32.244741 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249ffd47-8836-46ca-b80c-9fa90cceea62" path="/var/lib/kubelet/pods/249ffd47-8836-46ca-b80c-9fa90cceea62/volumes" Feb 01 07:07:32 crc kubenswrapper[5127]: I0201 07:07:32.245245 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c05dba-131a-4ed9-9895-67f80ab1e1f7" path="/var/lib/kubelet/pods/b0c05dba-131a-4ed9-9895-67f80ab1e1f7/volumes" Feb 01 07:07:32 crc kubenswrapper[5127]: I0201 07:07:32.286475 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerStarted","Data":"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028"} Feb 01 07:07:36 crc kubenswrapper[5127]: I0201 07:07:36.741240 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:07:36 crc kubenswrapper[5127]: I0201 07:07:36.741917 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:07:37 crc kubenswrapper[5127]: I0201 07:07:37.324426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" event={"ID":"90177717-535c-4840-9570-50d4f8363937","Type":"ContainerStarted","Data":"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f"} Feb 01 07:07:37 crc kubenswrapper[5127]: I0201 07:07:37.324710 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:37 crc kubenswrapper[5127]: I0201 07:07:37.352840 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" podStartSLOduration=8.414456731 podStartE2EDuration="24.3528205s" podCreationTimestamp="2026-02-01 07:07:13 +0000 UTC" firstStartedPulling="2026-02-01 07:07:13.943911855 +0000 UTC m=+1184.429814218" lastFinishedPulling="2026-02-01 07:07:29.882275614 +0000 UTC m=+1200.368177987" observedRunningTime="2026-02-01 07:07:37.345388769 +0000 UTC m=+1207.831291132" watchObservedRunningTime="2026-02-01 07:07:37.3528205 +0000 UTC m=+1207.838722863" Feb 01 07:07:38 crc 
kubenswrapper[5127]: I0201 07:07:38.339498 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d440b432-d2ce-4228-90b7-ad0c2e12ec86","Type":"ContainerStarted","Data":"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.340638 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.344261 5127 generic.go:334] "Generic (PLEG): container finished" podID="a3845481-effe-4cb2-9249-e9311df519a0" containerID="ce93d2dfd29066e0859bd832cc0c9e0c839d3111b25ef6f8c4cfba58f6729a4a" exitCode=0 Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.344344 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerDied","Data":"ce93d2dfd29066e0859bd832cc0c9e0c839d3111b25ef6f8c4cfba58f6729a4a"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.346845 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerStarted","Data":"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.362014 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.299878566 podStartE2EDuration="20.361997729s" podCreationTimestamp="2026-02-01 07:07:18 +0000 UTC" firstStartedPulling="2026-02-01 07:07:29.812962888 +0000 UTC m=+1200.298865261" lastFinishedPulling="2026-02-01 07:07:35.875082061 +0000 UTC m=+1206.360984424" observedRunningTime="2026-02-01 07:07:38.36060017 +0000 UTC m=+1208.846502543" watchObservedRunningTime="2026-02-01 07:07:38.361997729 +0000 UTC m=+1208.847900082" Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.362084 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerStarted","Data":"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.374440 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerStarted","Data":"b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.379573 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerStarted","Data":"df507cd5ec54f237e6d768044e6d52556d60d036c40521c2cb898928ad478155"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.383930 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf","Type":"ContainerStarted","Data":"68db31784fbb766278c199e8e73106a300362c657897152ed214f52dcf6d04fa"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.384851 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.399155 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86" 
event={"ID":"4b0be460-5699-4787-9c9e-90df6400faed","Type":"ContainerStarted","Data":"86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349"} Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.406173 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hqn86" Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.454012 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hqn86" podStartSLOduration=8.549211812 podStartE2EDuration="14.453992248s" podCreationTimestamp="2026-02-01 07:07:24 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.423814698 +0000 UTC m=+1200.909717061" lastFinishedPulling="2026-02-01 07:07:36.328595134 +0000 UTC m=+1206.814497497" observedRunningTime="2026-02-01 07:07:38.441668644 +0000 UTC m=+1208.927571017" watchObservedRunningTime="2026-02-01 07:07:38.453992248 +0000 UTC m=+1208.939894601" Feb 01 07:07:38 crc kubenswrapper[5127]: I0201 07:07:38.466151 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.440045817 podStartE2EDuration="18.466133096s" podCreationTimestamp="2026-02-01 07:07:20 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.615019552 +0000 UTC m=+1201.100921915" lastFinishedPulling="2026-02-01 07:07:37.641106831 +0000 UTC m=+1208.127009194" observedRunningTime="2026-02-01 07:07:38.454249945 +0000 UTC m=+1208.940152298" watchObservedRunningTime="2026-02-01 07:07:38.466133096 +0000 UTC m=+1208.952035469" Feb 01 07:07:39 crc kubenswrapper[5127]: I0201 07:07:39.382868 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:07:39 crc kubenswrapper[5127]: I0201 07:07:39.423861 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerStarted","Data":"4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783"} Feb 01 07:07:39 crc kubenswrapper[5127]: I0201 07:07:39.472360 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"] Feb 01 07:07:39 crc kubenswrapper[5127]: I0201 07:07:39.472569 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="dnsmasq-dns" containerID="cri-o://49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f" gracePeriod=10 Feb 01 07:07:39 crc kubenswrapper[5127]: I0201 07:07:39.867350 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.006408 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g4gj\" (UniqueName: \"kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj\") pod \"90177717-535c-4840-9570-50d4f8363937\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.006484 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config\") pod \"90177717-535c-4840-9570-50d4f8363937\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.006505 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc\") pod \"90177717-535c-4840-9570-50d4f8363937\" (UID: \"90177717-535c-4840-9570-50d4f8363937\") " Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.011717 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj" (OuterVolumeSpecName: "kube-api-access-9g4gj") pod "90177717-535c-4840-9570-50d4f8363937" (UID: "90177717-535c-4840-9570-50d4f8363937"). InnerVolumeSpecName "kube-api-access-9g4gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.047084 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config" (OuterVolumeSpecName: "config") pod "90177717-535c-4840-9570-50d4f8363937" (UID: "90177717-535c-4840-9570-50d4f8363937"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.056173 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90177717-535c-4840-9570-50d4f8363937" (UID: "90177717-535c-4840-9570-50d4f8363937"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.108092 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g4gj\" (UniqueName: \"kubernetes.io/projected/90177717-535c-4840-9570-50d4f8363937-kube-api-access-9g4gj\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.108310 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.108374 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90177717-535c-4840-9570-50d4f8363937-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.435321 5127 generic.go:334] "Generic (PLEG): container finished" podID="90177717-535c-4840-9570-50d4f8363937" containerID="49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f" exitCode=0 Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.435394 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.435424 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" event={"ID":"90177717-535c-4840-9570-50d4f8363937","Type":"ContainerDied","Data":"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f"} Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.435883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-wqxk7" event={"ID":"90177717-535c-4840-9570-50d4f8363937","Type":"ContainerDied","Data":"8eaeeb7bc61b30b06dbb903466b55acab4bc1f8ca1f082a6d2f93ecb1ca28d2f"} Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.435916 5127 scope.go:117] "RemoveContainer" containerID="49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.442607 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerStarted","Data":"3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea"} Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.442833 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.442926 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.444132 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerStarted","Data":"00429248c74c1cbdec0c992d840fe52b3fb9bf53f5c7b39b33a5b2a1b7997c03"} Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.447560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerStarted","Data":"49ff01000f18ae004dcef08ba577c2e60ccd6f97ac2dd571eedb3934f8d4d73e"} Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.457040 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"] Feb 01 07:07:40 crc 
kubenswrapper[5127]: I0201 07:07:40.472046 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-wqxk7"] Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.502192 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.610633293 podStartE2EDuration="14.502169812s" podCreationTimestamp="2026-02-01 07:07:26 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.777939051 +0000 UTC m=+1201.263841414" lastFinishedPulling="2026-02-01 07:07:39.66947555 +0000 UTC m=+1210.155377933" observedRunningTime="2026-02-01 07:07:40.50208956 +0000 UTC m=+1210.987991953" watchObservedRunningTime="2026-02-01 07:07:40.502169812 +0000 UTC m=+1210.988072175" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.510607 5127 scope.go:117] "RemoveContainer" containerID="0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.525547 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=7.591347822 podStartE2EDuration="16.525524554s" podCreationTimestamp="2026-02-01 07:07:24 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.747263251 +0000 UTC m=+1201.233165604" lastFinishedPulling="2026-02-01 07:07:39.681439963 +0000 UTC m=+1210.167342336" observedRunningTime="2026-02-01 07:07:40.522016679 +0000 UTC m=+1211.007919052" watchObservedRunningTime="2026-02-01 07:07:40.525524554 +0000 UTC m=+1211.011426917" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.544629 5127 scope.go:117] "RemoveContainer" containerID="49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f" Feb 01 07:07:40 crc kubenswrapper[5127]: E0201 07:07:40.546226 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f\": container with ID starting with 49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f not found: ID does not exist" containerID="49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.546281 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f"} err="failed to get container status \"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f\": rpc error: code = NotFound desc = could not find container \"49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f\": container with ID starting with 49efb4734bd08b600dd6d831bdecd5d72e346a8d9cde7dde4f108c5fc396265f not found: ID does not exist" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.546314 5127 scope.go:117] "RemoveContainer" containerID="0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.551263 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9przj" podStartSLOduration=10.97962073 podStartE2EDuration="16.55124116s" podCreationTimestamp="2026-02-01 07:07:24 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.885661376 +0000 UTC m=+1201.371563739" lastFinishedPulling="2026-02-01 07:07:36.457281806 +0000 UTC m=+1206.943184169" observedRunningTime="2026-02-01 07:07:40.539701198 +0000 UTC m=+1211.025603601" watchObservedRunningTime="2026-02-01 
07:07:40.55124116 +0000 UTC m=+1211.037143563" Feb 01 07:07:40 crc kubenswrapper[5127]: E0201 07:07:40.554439 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3\": container with ID starting with 0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3 not found: ID does not exist" containerID="0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.554491 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3"} err="failed to get container status \"0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3\": rpc error: code = NotFound desc = could not find container \"0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3\": container with ID starting with 0ac74d834e108d987748d12f2b0fee73188b837dca9a7a5aac83964bbf8bada3 not found: ID does not exist" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.568727 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.568790 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:40 crc kubenswrapper[5127]: I0201 07:07:40.609944 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:41 crc kubenswrapper[5127]: I0201 07:07:41.457887 5127 generic.go:334] "Generic (PLEG): container finished" podID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerID="f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607" exitCode=0 Feb 01 07:07:41 crc kubenswrapper[5127]: I0201 07:07:41.457970 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerDied","Data":"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607"} Feb 01 07:07:41 crc kubenswrapper[5127]: I0201 07:07:41.461418 5127 generic.go:334] "Generic (PLEG): container finished" podID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerID="2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4" exitCode=0 Feb 01 07:07:41 crc kubenswrapper[5127]: I0201 07:07:41.461476 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerDied","Data":"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4"} Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.250281 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90177717-535c-4840-9570-50d4f8363937" path="/var/lib/kubelet/pods/90177717-535c-4840-9570-50d4f8363937/volumes" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.473971 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerStarted","Data":"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd"} Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.476464 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerStarted","Data":"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8"} Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.512418 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.080945642 podStartE2EDuration="26.512392899s" podCreationTimestamp="2026-02-01 07:07:16 +0000 UTC" firstStartedPulling="2026-02-01 07:07:30.195205632 +0000 UTC m=+1200.681107995" lastFinishedPulling="2026-02-01 07:07:36.626652879 +0000 UTC m=+1207.112555252" observedRunningTime="2026-02-01 07:07:42.496156751 +0000 UTC m=+1212.982059164" watchObservedRunningTime="2026-02-01 07:07:42.512392899 +0000 UTC m=+1212.998295292" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.528431 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.561659 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.090372067 podStartE2EDuration="27.561633732s" podCreationTimestamp="2026-02-01 07:07:15 +0000 UTC" firstStartedPulling="2026-02-01 07:07:29.856969569 +0000 UTC m=+1200.342871952" lastFinishedPulling="2026-02-01 07:07:36.328231254 +0000 UTC m=+1206.814133617" observedRunningTime="2026-02-01 07:07:42.521762284 +0000 UTC m=+1213.007664647" watchObservedRunningTime="2026-02-01 07:07:42.561633732 +0000 UTC m=+1213.047536125" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.700627 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.700711 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.763185 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.802644 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"] Feb 01 07:07:42 crc kubenswrapper[5127]: E0201 07:07:42.803072 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="init" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.803097 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="init" Feb 01 07:07:42 crc kubenswrapper[5127]: E0201 07:07:42.803127 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="dnsmasq-dns" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.803136 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="dnsmasq-dns" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.803329 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="90177717-535c-4840-9570-50d4f8363937" containerName="dnsmasq-dns" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.804315 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.811217 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.830399 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"] Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.856854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.856934 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.857147 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.857380 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w49n\" (UniqueName: \"kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.934774 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"] Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.936526 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.941393 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.953634 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"] Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959241 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959292 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959325 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959347 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959364 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959444 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w49n\" (UniqueName: \"kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: 
\"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959493 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.959517 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.960457 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.961634 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:42 crc kubenswrapper[5127]: I0201 07:07:42.961724 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.007182 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w49n\" (UniqueName: \"kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n\") pod \"dnsmasq-dns-794868bd45-kvtzt\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") " pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061274 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061357 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061393 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " 
pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061429 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061463 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061489 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.061644 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.062070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.062728 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.065147 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.082339 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.094426 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt\") pod \"ovn-controller-metrics-6f5bs\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") " pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.108719 
5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"] Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.109286 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kvtzt" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.155431 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"] Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.157082 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.160669 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.178982 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"] Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.250083 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6f5bs" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.274552 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.274661 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.274725 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.274754 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlks\" (UniqueName: \"kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.274817 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.376218 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" 
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.376292 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlks\" (UniqueName: \"kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.376361 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.376425 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.376463 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.377436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.377559 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.378204 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.378330 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.401079 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlks\" (UniqueName: \"kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks\") pod \"dnsmasq-dns-757dc6fff9-sqzbx\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.520713 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.527277 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.583777 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.610597 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"]
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.689486 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"]
Feb 01 07:07:43 crc kubenswrapper[5127]: W0201 07:07:43.701421 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a9fd1d_985f_497f_9b8e_773013dc8747.slice/crio-a6f3ae1340374b5cb40c86734020c01cdc48a6be1d2f748b1d07a133c4ed4260 WatchSource:0}: Error finding container a6f3ae1340374b5cb40c86734020c01cdc48a6be1d2f748b1d07a133c4ed4260: Status 404 returned error can't find the container with id a6f3ae1340374b5cb40c86734020c01cdc48a6be1d2f748b1d07a133c4ed4260
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.808024 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.809756 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.811684 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.814465 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qjjn9"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.814737 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.814884 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.824311 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883615 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchqn\" (UniqueName: \"kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883652 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883698 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883720 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883792 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883815 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.883844 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985558 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gchqn\" (UniqueName: \"kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985628 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985668 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985690 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985757 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985777 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.985796 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.986785 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.988428 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.992126 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:43 crc kubenswrapper[5127]: I0201 07:07:43.996367 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.011178 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.011755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchqn\" (UniqueName: \"kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.015347 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " pod="openstack/ovn-northd-0"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.043900 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"]
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.136541 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.497834 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f5bs" event={"ID":"24a9fd1d-985f-497f-9b8e-773013dc8747","Type":"ContainerStarted","Data":"a6f3ae1340374b5cb40c86734020c01cdc48a6be1d2f748b1d07a133c4ed4260"}
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.499496 5127 generic.go:334] "Generic (PLEG): container finished" podID="aba8f834-1626-4c41-b594-01e152eb2da8" containerID="b610a617a174c8372b448fa6635d589cfe376adad67428f060851d3e96b1f594" exitCode=0
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.499531 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kvtzt" event={"ID":"aba8f834-1626-4c41-b594-01e152eb2da8","Type":"ContainerDied","Data":"b610a617a174c8372b448fa6635d589cfe376adad67428f060851d3e96b1f594"}
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.499560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kvtzt" event={"ID":"aba8f834-1626-4c41-b594-01e152eb2da8","Type":"ContainerStarted","Data":"42c800830910cf622bb9dfd33a4e4be8763cb413fa36f8641b52cc17e0d54c5d"}
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.501412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" event={"ID":"0d65ea4e-02d4-490f-a424-f31f053ac7d6","Type":"ContainerStarted","Data":"dc37d5604aa0aafdd1b6d0519a1e5ad7a3b5a0abee4e0cb096c02da953fb15ff"}
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.597581 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 01 07:07:44 crc kubenswrapper[5127]: W0201 07:07:44.626901 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c50e0a2_f119_4a1a_911f_f7898cceddb8.slice/crio-df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd WatchSource:0}: Error finding container df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd: Status 404 returned error can't find the container with id df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.725860 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kvtzt"
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.903674 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc\") pod \"aba8f834-1626-4c41-b594-01e152eb2da8\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") "
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.903758 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config\") pod \"aba8f834-1626-4c41-b594-01e152eb2da8\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") "
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.903811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w49n\" (UniqueName: \"kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n\") pod \"aba8f834-1626-4c41-b594-01e152eb2da8\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") "
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.903941 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb\") pod \"aba8f834-1626-4c41-b594-01e152eb2da8\" (UID: \"aba8f834-1626-4c41-b594-01e152eb2da8\") "
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.912718 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n" (OuterVolumeSpecName: "kube-api-access-4w49n") pod "aba8f834-1626-4c41-b594-01e152eb2da8" (UID: "aba8f834-1626-4c41-b594-01e152eb2da8"). InnerVolumeSpecName "kube-api-access-4w49n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.923264 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aba8f834-1626-4c41-b594-01e152eb2da8" (UID: "aba8f834-1626-4c41-b594-01e152eb2da8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.923324 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aba8f834-1626-4c41-b594-01e152eb2da8" (UID: "aba8f834-1626-4c41-b594-01e152eb2da8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:07:44 crc kubenswrapper[5127]: I0201 07:07:44.929883 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config" (OuterVolumeSpecName: "config") pod "aba8f834-1626-4c41-b594-01e152eb2da8" (UID: "aba8f834-1626-4c41-b594-01e152eb2da8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.006406 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.006455 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.006476 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w49n\" (UniqueName: \"kubernetes.io/projected/aba8f834-1626-4c41-b594-01e152eb2da8-kube-api-access-4w49n\") on node \"crc\" DevicePath \"\""
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.006495 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba8f834-1626-4c41-b594-01e152eb2da8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.513372 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerStarted","Data":"df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd"}
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.516361 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kvtzt" event={"ID":"aba8f834-1626-4c41-b594-01e152eb2da8","Type":"ContainerDied","Data":"42c800830910cf622bb9dfd33a4e4be8763cb413fa36f8641b52cc17e0d54c5d"}
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.516458 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kvtzt"
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.516770 5127 scope.go:117] "RemoveContainer" containerID="b610a617a174c8372b448fa6635d589cfe376adad67428f060851d3e96b1f594"
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.655454 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"]
Feb 01 07:07:45 crc kubenswrapper[5127]: I0201 07:07:45.668180 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kvtzt"]
Feb 01 07:07:46 crc kubenswrapper[5127]: I0201 07:07:46.261164 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba8f834-1626-4c41-b594-01e152eb2da8" path="/var/lib/kubelet/pods/aba8f834-1626-4c41-b594-01e152eb2da8/volumes"
Feb 01 07:07:46 crc kubenswrapper[5127]: I0201 07:07:46.882248 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 01 07:07:46 crc kubenswrapper[5127]: I0201 07:07:46.882670 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 01 07:07:48 crc kubenswrapper[5127]: I0201 07:07:48.259325 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Feb 01 07:07:48 crc kubenswrapper[5127]: I0201 07:07:48.259403 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.592434 5127 generic.go:334] "Generic (PLEG): container finished" podID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" containerID="be93b2bb4d0ba2371ac4ea39ed871b456647c5fa971fabab3a18e9e992ab8507" exitCode=0
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.593646 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" event={"ID":"0d65ea4e-02d4-490f-a424-f31f053ac7d6","Type":"ContainerDied","Data":"be93b2bb4d0ba2371ac4ea39ed871b456647c5fa971fabab3a18e9e992ab8507"}
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.616062 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f5bs" event={"ID":"24a9fd1d-985f-497f-9b8e-773013dc8747","Type":"ContainerStarted","Data":"fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84"}
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.638655 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerStarted","Data":"0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625"}
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.640098 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.656792 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.750561 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.2278105 podStartE2EDuration="7.750525717s" podCreationTimestamp="2026-02-01 07:07:43 +0000 UTC" firstStartedPulling="2026-02-01 07:07:44.638533514 +0000 UTC m=+1215.124435877" lastFinishedPulling="2026-02-01 07:07:50.161248721 +0000 UTC m=+1220.647151094" observedRunningTime="2026-02-01 07:07:50.731600814 +0000 UTC m=+1221.217503187" watchObservedRunningTime="2026-02-01 07:07:50.750525717 +0000 UTC m=+1221.236428080"
m=+1221.217503187" watchObservedRunningTime="2026-02-01 07:07:50.750525717 +0000 UTC m=+1221.236428080" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.796223 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"] Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.864838 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6f5bs" podStartSLOduration=8.86481639 podStartE2EDuration="8.86481639s" podCreationTimestamp="2026-02-01 07:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:07:50.812974816 +0000 UTC m=+1221.298877189" watchObservedRunningTime="2026-02-01 07:07:50.86481639 +0000 UTC m=+1221.350718753" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.955870 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"] Feb 01 07:07:50 crc kubenswrapper[5127]: E0201 07:07:50.956669 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba8f834-1626-4c41-b594-01e152eb2da8" containerName="init" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.956690 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba8f834-1626-4c41-b594-01e152eb2da8" containerName="init" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.956892 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba8f834-1626-4c41-b594-01e152eb2da8" containerName="init" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.957835 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:50 crc kubenswrapper[5127]: I0201 07:07:50.964280 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"] Feb 01 07:07:51 crc kubenswrapper[5127]: E0201 07:07:51.033968 5127 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 01 07:07:51 crc kubenswrapper[5127]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/0d65ea4e-02d4-490f-a424-f31f053ac7d6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 01 07:07:51 crc kubenswrapper[5127]: > podSandboxID="dc37d5604aa0aafdd1b6d0519a1e5ad7a3b5a0abee4e0cb096c02da953fb15ff" Feb 01 07:07:51 crc kubenswrapper[5127]: E0201 07:07:51.034132 5127 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 01 07:07:51 crc kubenswrapper[5127]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n654h99h64ch5dbh6dh555h587h64bh5cfh647h5fdh57ch679h9h597h5f5hbch59bh54fh575h566h667h586h5f5h65ch5bch57h68h65ch58bh694h5cfq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqlks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-757dc6fff9-sqzbx_openstack(0d65ea4e-02d4-490f-a424-f31f053ac7d6): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/0d65ea4e-02d4-490f-a424-f31f053ac7d6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 01 07:07:51 crc kubenswrapper[5127]: > logger="UnhandledError" Feb 01 07:07:51 crc kubenswrapper[5127]: E0201 07:07:51.035311 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/0d65ea4e-02d4-490f-a424-f31f053ac7d6/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" podUID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.132752 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
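[Note] The CreateContainerError above comes from a subPath mount. Per the Container spec dumped in the error, the dnsmasq-dns container mounts single keys of its ConfigMap volumes via SubPath; the runtime realizes each subPath as a bind mount whose source lives under /var/lib/kubelet/pods/<uid>/volume-subpaths/<volume>/<container>/<index>. "No such file or directory" means that source was already gone when CRI-O tried to create the container, which is consistent with this pod being deleted moments earlier (SyncLoop DELETE for dnsmasq-dns-757dc6fff9-sqzbx at 07:07:50.796). A sketch of the failing mount, rebuilt from the logged spec using the real k8s.io/api types (the surrounding program is illustrative only):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Mirrors the VolumeMount from the logged Container spec: one key of
	// the dns-svc ConfigMap volume, bind-mounted via SubPath.
	m := corev1.VolumeMount{
		Name:      "dns-svc",
		ReadOnly:  true,
		MountPath: "/etc/dnsmasq.d/hosts/dns-svc",
		SubPath:   "dns-svc",
	}
	fmt.Printf("%+v\n", m)
}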
volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.132826 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.133028 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.133065 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.133152 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.234824 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.234919 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.234985 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.235007 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.235057 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.235954 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.236089 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.236677 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.236873 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.259867 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn\") pod \"dnsmasq-dns-6cb545bd4c-l66pl\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.293333 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.650632 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerStarted","Data":"44b08e72c489b008fa46527782b6bdc9a481d3a4439b530c26416808e1a4301f"} Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.789391 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"] Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.876675 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.883577 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.889691 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.889941 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.890269 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-h9vjf" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.890464 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.915891 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 01 07:07:51 crc kubenswrapper[5127]: I0201 07:07:51.923570 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050197 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb\") pod \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050304 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc\") pod \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050414 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlks\" (UniqueName: \"kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks\") pod \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config\") pod \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050529 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb\") pod \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\" (UID: \"0d65ea4e-02d4-490f-a424-f31f053ac7d6\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050764 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050831 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmg5q\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " 
pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.050876 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.051018 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.051177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.051298 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.057694 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks" (OuterVolumeSpecName: "kube-api-access-wqlks") pod "0d65ea4e-02d4-490f-a424-f31f053ac7d6" (UID: "0d65ea4e-02d4-490f-a424-f31f053ac7d6"). InnerVolumeSpecName "kube-api-access-wqlks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.092278 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d65ea4e-02d4-490f-a424-f31f053ac7d6" (UID: "0d65ea4e-02d4-490f-a424-f31f053ac7d6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.093657 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config" (OuterVolumeSpecName: "config") pod "0d65ea4e-02d4-490f-a424-f31f053ac7d6" (UID: "0d65ea4e-02d4-490f-a424-f31f053ac7d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.100160 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d65ea4e-02d4-490f-a424-f31f053ac7d6" (UID: "0d65ea4e-02d4-490f-a424-f31f053ac7d6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.112657 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d65ea4e-02d4-490f-a424-f31f053ac7d6" (UID: "0d65ea4e-02d4-490f-a424-f31f053ac7d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152572 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmg5q\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152633 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152695 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152736 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152765 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152783 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152841 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlks\" (UniqueName: \"kubernetes.io/projected/0d65ea4e-02d4-490f-a424-f31f053ac7d6-kube-api-access-wqlks\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152852 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152860 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152870 5127 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.152879 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d65ea4e-02d4-490f-a424-f31f053ac7d6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.152855 5127 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.152903 5127 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.152953 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift podName:7e0ea2ea-fe04-40c8-87d0-1321996cbcba nodeName:}" failed. No retries permitted until 2026-02-01 07:07:52.652936547 +0000 UTC m=+1223.138838910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift") pod "swift-storage-0" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba") : configmap "swift-ring-files" not found Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.153234 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.153416 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.153519 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.158412 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.175191 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmg5q\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.177692 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:52 crc 
Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.476705 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" containerName="init"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.476728 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" containerName="init"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.476911 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" containerName="init"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.477515 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.479979 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.480037 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.487395 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.489847 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-txd5n"]
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.508693 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-txd5n"]
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.525744 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-47sg7"]
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.527555 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.550322 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-glfqb ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-txd5n" podUID="7b3a530e-8a38-44ff-80a4-a7dcd99add5e"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559562 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559685 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559717 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559753 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559776 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.559846 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glfqb\" (UniqueName: \"kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.560029 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-47sg7"]
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.656894 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx" event={"ID":"0d65ea4e-02d4-490f-a424-f31f053ac7d6","Type":"ContainerDied","Data":"dc37d5604aa0aafdd1b6d0519a1e5ad7a3b5a0abee4e0cb096c02da953fb15ff"}
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.656920 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-sqzbx"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.656943 5127 scope.go:117] "RemoveContainer" containerID="be93b2bb4d0ba2371ac4ea39ed871b456647c5fa971fabab3a18e9e992ab8507"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661038 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661070 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661099 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqmtr\" (UniqueName: \"kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661148 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661238 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661280 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661358 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661390 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glfqb\" (UniqueName: \"kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661411 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661463 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661630 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661713 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661738 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661761 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.661782 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.661972 5127 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.661994 5127 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 01 07:07:52 crc kubenswrapper[5127]: E0201 07:07:52.662046 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift podName:7e0ea2ea-fe04-40c8-87d0-1321996cbcba nodeName:}" failed. No retries permitted until 2026-02-01 07:07:53.662029942 +0000 UTC m=+1224.147932385 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift") pod "swift-storage-0" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba") : configmap "swift-ring-files" not found
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.662520 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.664489 5127 generic.go:334] "Generic (PLEG): container finished" podID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerID="7a0917b4ceb20433e417aa83fd15f81d56a2a23d54172cd546aa05dc4a139d45" exitCode=0
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.664905 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" event={"ID":"e3ad894b-32c0-4283-839b-e29bf71b1381","Type":"ContainerDied","Data":"7a0917b4ceb20433e417aa83fd15f81d56a2a23d54172cd546aa05dc4a139d45"}
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.665042 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" event={"ID":"e3ad894b-32c0-4283-839b-e29bf71b1381","Type":"ContainerStarted","Data":"f27c8b04c719e63b9c184fe5e268eb2b7a87188a0f82e75c6511c49d2d6fe91f"}
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.665193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n"
Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.665230 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.666075 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.667889 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.687398 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glfqb\" (UniqueName: \"kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb\") pod \"swift-ring-rebalance-txd5n\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.710469 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"] Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.718952 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-sqzbx"] Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.772704 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.772829 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.772870 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqmtr\" (UniqueName: \"kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.772897 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.772960 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.773023 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.773089 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.775189 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.775689 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.776652 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.779965 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.780361 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.787575 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.793657 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.797218 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqmtr\" (UniqueName: \"kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr\") pod \"swift-ring-rebalance-47sg7\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") " pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.860260 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-47sg7" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.976503 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glfqb\" (UniqueName: \"kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.976861 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.976889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.976964 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.976993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.977027 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.977071 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices\") pod \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\" (UID: \"7b3a530e-8a38-44ff-80a4-a7dcd99add5e\") " Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.977862 5127 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.978099 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.981518 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb" (OuterVolumeSpecName: "kube-api-access-glfqb") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "kube-api-access-glfqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.981782 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts" (OuterVolumeSpecName: "scripts") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.982230 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.989829 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:07:52 crc kubenswrapper[5127]: I0201 07:07:52.991204 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7b3a530e-8a38-44ff-80a4-a7dcd99add5e" (UID: "7b3a530e-8a38-44ff-80a4-a7dcd99add5e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078813 5127 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078861 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078876 5127 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078889 5127 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078904 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078917 5127 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.078932 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glfqb\" (UniqueName: \"kubernetes.io/projected/7b3a530e-8a38-44ff-80a4-a7dcd99add5e-kube-api-access-glfqb\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.295186 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-47sg7"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.415953 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.494819 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.673328 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" event={"ID":"e3ad894b-32c0-4283-839b-e29bf71b1381","Type":"ContainerStarted","Data":"b9ff0e88c0e37b2e9fadda6a2bad473d2a35e03ce64018a8297df33d9e81fbbe"} Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.673786 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.675325 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-47sg7" event={"ID":"379e85af-3108-4c83-88cb-a71948674382","Type":"ContainerStarted","Data":"652f2804146360ea2f589553c7139013d47a40efd238c2b78ec9967555a30f9a"} Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.677247 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-txd5n" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.692720 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:53 crc kubenswrapper[5127]: E0201 07:07:53.693105 5127 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 07:07:53 crc kubenswrapper[5127]: E0201 07:07:53.693124 5127 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 07:07:53 crc kubenswrapper[5127]: E0201 07:07:53.693173 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift podName:7e0ea2ea-fe04-40c8-87d0-1321996cbcba nodeName:}" failed. No retries permitted until 2026-02-01 07:07:55.693157476 +0000 UTC m=+1226.179059839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift") pod "swift-storage-0" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba") : configmap "swift-ring-files" not found Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.707777 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" podStartSLOduration=3.707757471 podStartE2EDuration="3.707757471s" podCreationTimestamp="2026-02-01 07:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:07:53.704279947 +0000 UTC m=+1224.190182310" watchObservedRunningTime="2026-02-01 07:07:53.707757471 +0000 UTC m=+1224.193659834" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.747595 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-txd5n"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.759036 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-txd5n"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.826985 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-g4264"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.828406 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4264" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.849188 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g4264"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.896025 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bfb0-account-create-update-mjdkv"] Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.897200 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.900058 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 01 07:07:53 crc kubenswrapper[5127]: I0201 07:07:53.917157 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bfb0-account-create-update-mjdkv"] Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:53.999564 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:53.999745 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbl2\" (UniqueName: \"kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:53.999846 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2w72\" (UniqueName: \"kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:53.999992 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.102091 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2w72\" (UniqueName: \"kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.102875 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.102924 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.103029 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbl2\" (UniqueName: 
\"kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.104116 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.104342 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.132688 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2w72\" (UniqueName: \"kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72\") pod \"glance-bfb0-account-create-update-mjdkv\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.135132 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbl2\" (UniqueName: \"kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2\") pod \"glance-db-create-g4264\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.151344 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4264" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.214496 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.259535 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d65ea4e-02d4-490f-a424-f31f053ac7d6" path="/var/lib/kubelet/pods/0d65ea4e-02d4-490f-a424-f31f053ac7d6/volumes" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.260558 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3a530e-8a38-44ff-80a4-a7dcd99add5e" path="/var/lib/kubelet/pods/7b3a530e-8a38-44ff-80a4-a7dcd99add5e/volumes" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.403776 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.475329 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.754699 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-g4264"] Feb 01 07:07:54 crc kubenswrapper[5127]: W0201 07:07:54.767405 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb0ca89_32ae_4796_8b4e_b9ac35cfaafd.slice/crio-55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b WatchSource:0}: Error finding container 55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b: Status 404 returned error can't find the container with id 55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b Feb 01 07:07:54 crc kubenswrapper[5127]: I0201 07:07:54.799654 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bfb0-account-create-update-mjdkv"] Feb 01 07:07:54 crc kubenswrapper[5127]: W0201 07:07:54.799678 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ad9d98_24ab_40fe_ac49_63b423cd33de.slice/crio-3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68 WatchSource:0}: Error finding container 3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68: Status 404 returned error can't find the container with id 3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68 Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.508638 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k9rgk"] Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.511184 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.513148 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.517400 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k9rgk"] Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.636004 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.636216 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlqc2\" (UniqueName: \"kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.739550 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.739905 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.740009 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlqc2\" (UniqueName: \"kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: E0201 07:07:55.740460 5127 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 07:07:55 crc kubenswrapper[5127]: E0201 07:07:55.740477 5127 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 07:07:55 crc kubenswrapper[5127]: E0201 07:07:55.740525 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift podName:7e0ea2ea-fe04-40c8-87d0-1321996cbcba nodeName:}" failed. No retries permitted until 2026-02-01 07:07:59.740508027 +0000 UTC m=+1230.226410400 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift") pod "swift-storage-0" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba") : configmap "swift-ring-files" not found Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.740842 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.744849 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4264" event={"ID":"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd","Type":"ContainerStarted","Data":"55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b"} Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.746065 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bfb0-account-create-update-mjdkv" event={"ID":"64ad9d98-24ab-40fe-ac49-63b423cd33de","Type":"ContainerStarted","Data":"3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68"} Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.801334 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlqc2\" (UniqueName: \"kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2\") pod \"root-account-create-update-k9rgk\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:55 crc kubenswrapper[5127]: I0201 07:07:55.838406 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k9rgk" Feb 01 07:07:56 crc kubenswrapper[5127]: I0201 07:07:56.761544 5127 generic.go:334] "Generic (PLEG): container finished" podID="feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" containerID="2bd659b931ab10a61b286761cd4b38488cdd2ed33afa200b2a1085ee2b5b0190" exitCode=0 Feb 01 07:07:56 crc kubenswrapper[5127]: I0201 07:07:56.761606 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4264" event={"ID":"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd","Type":"ContainerDied","Data":"2bd659b931ab10a61b286761cd4b38488cdd2ed33afa200b2a1085ee2b5b0190"} Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.392619 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k9rgk"] Feb 01 07:07:57 crc kubenswrapper[5127]: W0201 07:07:57.402718 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d3831b_3f13_441c_b809_40428cfd7b4b.slice/crio-0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94 WatchSource:0}: Error finding container 0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94: Status 404 returned error can't find the container with id 0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94 Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.770670 5127 generic.go:334] "Generic (PLEG): container finished" podID="64ad9d98-24ab-40fe-ac49-63b423cd33de" containerID="d161c03d918bf5904d9550124bdb130cd830048ad05dc7bf70edc02f5386bc0c" exitCode=0 Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.771188 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bfb0-account-create-update-mjdkv" event={"ID":"64ad9d98-24ab-40fe-ac49-63b423cd33de","Type":"ContainerDied","Data":"d161c03d918bf5904d9550124bdb130cd830048ad05dc7bf70edc02f5386bc0c"} Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.775099 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-47sg7" event={"ID":"379e85af-3108-4c83-88cb-a71948674382","Type":"ContainerStarted","Data":"9736798c00ea577ff511a799e202c624f7065f1a456d8c24e360b90f890a6de7"} Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.778365 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k9rgk" event={"ID":"72d3831b-3f13-441c-b809-40428cfd7b4b","Type":"ContainerStarted","Data":"88291eba0b8528cf903ca664c25bb565e51ca8aefe36f5d3a39b3e4c18656b90"} Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.778416 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k9rgk" event={"ID":"72d3831b-3f13-441c-b809-40428cfd7b4b","Type":"ContainerStarted","Data":"0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94"} Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 07:07:57.829294 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-47sg7" podStartSLOduration=2.177647036 podStartE2EDuration="5.82927108s" podCreationTimestamp="2026-02-01 07:07:52 +0000 UTC" firstStartedPulling="2026-02-01 07:07:53.304412586 +0000 UTC m=+1223.790314949" lastFinishedPulling="2026-02-01 07:07:56.95603663 +0000 UTC m=+1227.441938993" observedRunningTime="2026-02-01 07:07:57.815193079 +0000 UTC m=+1228.301095452" watchObservedRunningTime="2026-02-01 07:07:57.82927108 +0000 UTC m=+1228.315173473" Feb 01 07:07:57 crc kubenswrapper[5127]: I0201 
07:07:57.842949 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-k9rgk" podStartSLOduration=2.84292566 podStartE2EDuration="2.84292566s" podCreationTimestamp="2026-02-01 07:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:07:57.828238283 +0000 UTC m=+1228.314140656" watchObservedRunningTime="2026-02-01 07:07:57.84292566 +0000 UTC m=+1228.328828033" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.132630 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-g4264" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.186208 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts\") pod \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.186625 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqbl2\" (UniqueName: \"kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2\") pod \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\" (UID: \"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd\") " Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.188238 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" (UID: "feb0ca89-32ae-4796-8b4e-b9ac35cfaafd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.192947 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2" (OuterVolumeSpecName: "kube-api-access-cqbl2") pod "feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" (UID: "feb0ca89-32ae-4796-8b4e-b9ac35cfaafd"). InnerVolumeSpecName "kube-api-access-cqbl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.226887 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-f4qps"] Feb 01 07:07:58 crc kubenswrapper[5127]: E0201 07:07:58.227343 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" containerName="mariadb-database-create" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.227359 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" containerName="mariadb-database-create" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.227639 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" containerName="mariadb-database-create" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.228334 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.261998 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f4qps"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.288626 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgpd\" (UniqueName: \"kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.289021 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.289191 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.289291 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqbl2\" (UniqueName: \"kubernetes.io/projected/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd-kube-api-access-cqbl2\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.338596 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9b2b-account-create-update-s4768"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.341127 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.347322 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.353971 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b2b-account-create-update-s4768"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.394888 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7r8g\" (UniqueName: \"kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g\") pod \"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.394960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts\") pod \"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.395027 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgpd\" (UniqueName: \"kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.395092 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.396201 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.418905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgpd\" (UniqueName: \"kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd\") pod \"keystone-db-create-f4qps\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.496691 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7r8g\" (UniqueName: \"kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g\") pod \"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.501554 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts\") pod 
\"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.502274 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts\") pod \"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.515657 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7r8g\" (UniqueName: \"kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g\") pod \"keystone-9b2b-account-create-update-s4768\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.536643 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-grk2z"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.538114 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.551114 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-grk2z"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.561498 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f4qps" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.604416 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2ws\" (UniqueName: \"kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.604735 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.656655 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ad9f-account-create-update-mv52x"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.657654 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.664907 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.669447 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ad9f-account-create-update-mv52x"] Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.678563 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.706294 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2ws\" (UniqueName: \"kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.706365 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.706412 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5757\" (UniqueName: \"kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.706439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.714355 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.733638 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2ws\" (UniqueName: \"kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws\") pod \"placement-db-create-grk2z\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.808145 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.808381 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5757\" (UniqueName: \"kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.809966 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-g4264" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.810067 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-g4264" event={"ID":"feb0ca89-32ae-4796-8b4e-b9ac35cfaafd","Type":"ContainerDied","Data":"55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b"} Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.810144 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55aa52422a4b13275d16692b5ca687972e454e2a1c989d491d37d8849667354b" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.812390 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.820825 5127 generic.go:334] "Generic (PLEG): container finished" podID="72d3831b-3f13-441c-b809-40428cfd7b4b" containerID="88291eba0b8528cf903ca664c25bb565e51ca8aefe36f5d3a39b3e4c18656b90" exitCode=0 Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.820998 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k9rgk" event={"ID":"72d3831b-3f13-441c-b809-40428cfd7b4b","Type":"ContainerDied","Data":"88291eba0b8528cf903ca664c25bb565e51ca8aefe36f5d3a39b3e4c18656b90"} Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.826138 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5757\" (UniqueName: \"kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757\") pod \"placement-ad9f-account-create-update-mv52x\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.860982 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-grk2z" Feb 01 07:07:58 crc kubenswrapper[5127]: I0201 07:07:58.985167 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.295516 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.305177 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-f4qps"] Feb 01 07:07:59 crc kubenswrapper[5127]: W0201 07:07:59.393917 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd57f121c_42a8_4515_9b9f_f540a3a78b79.slice/crio-d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b WatchSource:0}: Error finding container d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b: Status 404 returned error can't find the container with id d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.394379 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b2b-account-create-update-s4768"] Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.430656 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2w72\" (UniqueName: \"kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72\") pod \"64ad9d98-24ab-40fe-ac49-63b423cd33de\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.430757 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts\") pod \"64ad9d98-24ab-40fe-ac49-63b423cd33de\" (UID: \"64ad9d98-24ab-40fe-ac49-63b423cd33de\") " Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.431355 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64ad9d98-24ab-40fe-ac49-63b423cd33de" (UID: "64ad9d98-24ab-40fe-ac49-63b423cd33de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.437702 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72" (OuterVolumeSpecName: "kube-api-access-j2w72") pod "64ad9d98-24ab-40fe-ac49-63b423cd33de" (UID: "64ad9d98-24ab-40fe-ac49-63b423cd33de"). InnerVolumeSpecName "kube-api-access-j2w72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.487204 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ad9f-account-create-update-mv52x"] Feb 01 07:07:59 crc kubenswrapper[5127]: W0201 07:07:59.496642 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4fae6b8_8d43_4df5_b5c8_4482bf865a73.slice/crio-04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583 WatchSource:0}: Error finding container 04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583: Status 404 returned error can't find the container with id 04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583 Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.497224 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-grk2z"] Feb 01 07:07:59 crc kubenswrapper[5127]: W0201 07:07:59.498466 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9ce0d4c_bd50_4466_83fb_68bea7c4ed61.slice/crio-82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde WatchSource:0}: Error finding container 82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde: Status 404 returned error can't find the container with id 82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.532351 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ad9d98-24ab-40fe-ac49-63b423cd33de-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.532382 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2w72\" (UniqueName: \"kubernetes.io/projected/64ad9d98-24ab-40fe-ac49-63b423cd33de-kube-api-access-j2w72\") on node \"crc\" DevicePath \"\"" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.833118 5127 generic.go:334] "Generic (PLEG): container finished" podID="8b9f1eb0-ad17-4bd0-b554-bff78a522559" containerID="b45bb4738590ea1089edb04db72919b56e3e92a86966c7ea321a6b8125920f90" exitCode=0 Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.833179 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f4qps" event={"ID":"8b9f1eb0-ad17-4bd0-b554-bff78a522559","Type":"ContainerDied","Data":"b45bb4738590ea1089edb04db72919b56e3e92a86966c7ea321a6b8125920f90"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.833209 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f4qps" event={"ID":"8b9f1eb0-ad17-4bd0-b554-bff78a522559","Type":"ContainerStarted","Data":"5770e34f4af42f32f964e84b1ca1541ee4b6b63d4dbc16cb4b1e76b284e94364"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.835618 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0" Feb 01 07:07:59 crc kubenswrapper[5127]: E0201 07:07:59.835772 5127 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 01 07:07:59 crc kubenswrapper[5127]: E0201 07:07:59.835792 5127 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 01 07:07:59 crc kubenswrapper[5127]: E0201 07:07:59.835837 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift podName:7e0ea2ea-fe04-40c8-87d0-1321996cbcba nodeName:}" failed. No retries permitted until 2026-02-01 07:08:07.835821338 +0000 UTC m=+1238.321723701 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift") pod "swift-storage-0" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba") : configmap "swift-ring-files" not found Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.836338 5127 generic.go:334] "Generic (PLEG): container finished" podID="e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" containerID="7b50a2b6aa5d8e197f639de042061583730b740728b0fe70233696ba7d2e113e" exitCode=0 Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.836430 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-grk2z" event={"ID":"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61","Type":"ContainerDied","Data":"7b50a2b6aa5d8e197f639de042061583730b740728b0fe70233696ba7d2e113e"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.836460 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-grk2z" event={"ID":"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61","Type":"ContainerStarted","Data":"82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.837836 5127 generic.go:334] "Generic (PLEG): container finished" podID="b4fae6b8-8d43-4df5-b5c8-4482bf865a73" containerID="a9bda3ed30a3b102ae5aafe7c3aa829f283282c23e2e72306ed45823288d360e" exitCode=0 Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.837887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ad9f-account-create-update-mv52x" event={"ID":"b4fae6b8-8d43-4df5-b5c8-4482bf865a73","Type":"ContainerDied","Data":"a9bda3ed30a3b102ae5aafe7c3aa829f283282c23e2e72306ed45823288d360e"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.837907 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ad9f-account-create-update-mv52x" event={"ID":"b4fae6b8-8d43-4df5-b5c8-4482bf865a73","Type":"ContainerStarted","Data":"04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.839292 5127 generic.go:334] "Generic (PLEG): container finished" podID="d57f121c-42a8-4515-9b9f-f540a3a78b79" containerID="2245f9d4930b5b4a91b37b1a99574bd64676e417ddb0ffd544851354e99a25a4" exitCode=0 Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.839337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b2b-account-create-update-s4768" event={"ID":"d57f121c-42a8-4515-9b9f-f540a3a78b79","Type":"ContainerDied","Data":"2245f9d4930b5b4a91b37b1a99574bd64676e417ddb0ffd544851354e99a25a4"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.839355 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b2b-account-create-update-s4768" event={"ID":"d57f121c-42a8-4515-9b9f-f540a3a78b79","Type":"ContainerStarted","Data":"d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.842015 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bfb0-account-create-update-mjdkv" Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.842020 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bfb0-account-create-update-mjdkv" event={"ID":"64ad9d98-24ab-40fe-ac49-63b423cd33de","Type":"ContainerDied","Data":"3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68"} Feb 01 07:07:59 crc kubenswrapper[5127]: I0201 07:07:59.842050 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a5ce47c959b4979da467d9be65fdd28cfbe626f854b25fdeb04342853b64a68" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.173022 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k9rgk" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.254760 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlqc2\" (UniqueName: \"kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2\") pod \"72d3831b-3f13-441c-b809-40428cfd7b4b\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.254948 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts\") pod \"72d3831b-3f13-441c-b809-40428cfd7b4b\" (UID: \"72d3831b-3f13-441c-b809-40428cfd7b4b\") " Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.256240 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72d3831b-3f13-441c-b809-40428cfd7b4b" (UID: "72d3831b-3f13-441c-b809-40428cfd7b4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.261762 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2" (OuterVolumeSpecName: "kube-api-access-tlqc2") pod "72d3831b-3f13-441c-b809-40428cfd7b4b" (UID: "72d3831b-3f13-441c-b809-40428cfd7b4b"). InnerVolumeSpecName "kube-api-access-tlqc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.356340 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlqc2\" (UniqueName: \"kubernetes.io/projected/72d3831b-3f13-441c-b809-40428cfd7b4b-kube-api-access-tlqc2\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.356376 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d3831b-3f13-441c-b809-40428cfd7b4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.860553 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k9rgk" Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.860982 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k9rgk" event={"ID":"72d3831b-3f13-441c-b809-40428cfd7b4b","Type":"ContainerDied","Data":"0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94"} Feb 01 07:08:00 crc kubenswrapper[5127]: I0201 07:08:00.861020 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0115a7a5caf2f075f896ecf5282134315143a81ee6f6612b966063ce5ef6cb94" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.294813 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.331161 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f4qps" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.402371 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.405650 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="dnsmasq-dns" containerID="cri-o://5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c" gracePeriod=10 Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.480217 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts\") pod \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.480368 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rgpd\" (UniqueName: \"kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd\") pod \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\" (UID: \"8b9f1eb0-ad17-4bd0-b554-bff78a522559\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.480682 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b9f1eb0-ad17-4bd0-b554-bff78a522559" (UID: "8b9f1eb0-ad17-4bd0-b554-bff78a522559"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.480767 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f1eb0-ad17-4bd0-b554-bff78a522559-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.501477 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd" (OuterVolumeSpecName: "kube-api-access-5rgpd") pod "8b9f1eb0-ad17-4bd0-b554-bff78a522559" (UID: "8b9f1eb0-ad17-4bd0-b554-bff78a522559"). InnerVolumeSpecName "kube-api-access-5rgpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.582516 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rgpd\" (UniqueName: \"kubernetes.io/projected/8b9f1eb0-ad17-4bd0-b554-bff78a522559-kube-api-access-5rgpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.644446 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.652340 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.655441 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-grk2z" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785040 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts\") pod \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785127 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5757\" (UniqueName: \"kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757\") pod \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785234 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7r8g\" (UniqueName: \"kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g\") pod \"d57f121c-42a8-4515-9b9f-f540a3a78b79\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts\") pod \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\" (UID: \"b4fae6b8-8d43-4df5-b5c8-4482bf865a73\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785300 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts\") pod \"d57f121c-42a8-4515-9b9f-f540a3a78b79\" (UID: \"d57f121c-42a8-4515-9b9f-f540a3a78b79\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785375 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2ws\" (UniqueName: \"kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws\") pod \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\" (UID: \"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785494 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" (UID: "e9ce0d4c-bd50-4466-83fb-68bea7c4ed61"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785723 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785754 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4fae6b8-8d43-4df5-b5c8-4482bf865a73" (UID: "b4fae6b8-8d43-4df5-b5c8-4482bf865a73"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.785891 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d57f121c-42a8-4515-9b9f-f540a3a78b79" (UID: "d57f121c-42a8-4515-9b9f-f540a3a78b79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.789259 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g" (OuterVolumeSpecName: "kube-api-access-w7r8g") pod "d57f121c-42a8-4515-9b9f-f540a3a78b79" (UID: "d57f121c-42a8-4515-9b9f-f540a3a78b79"). InnerVolumeSpecName "kube-api-access-w7r8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.791041 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757" (OuterVolumeSpecName: "kube-api-access-f5757") pod "b4fae6b8-8d43-4df5-b5c8-4482bf865a73" (UID: "b4fae6b8-8d43-4df5-b5c8-4482bf865a73"). InnerVolumeSpecName "kube-api-access-f5757". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.791182 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws" (OuterVolumeSpecName: "kube-api-access-kn2ws") pod "e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" (UID: "e9ce0d4c-bd50-4466-83fb-68bea7c4ed61"). InnerVolumeSpecName "kube-api-access-kn2ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.862625 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.888063 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5757\" (UniqueName: \"kubernetes.io/projected/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-kube-api-access-f5757\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.888091 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7r8g\" (UniqueName: \"kubernetes.io/projected/d57f121c-42a8-4515-9b9f-f540a3a78b79-kube-api-access-w7r8g\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.888100 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4fae6b8-8d43-4df5-b5c8-4482bf865a73-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.888109 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d57f121c-42a8-4515-9b9f-f540a3a78b79-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.888117 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2ws\" (UniqueName: \"kubernetes.io/projected/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61-kube-api-access-kn2ws\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.904401 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ad9f-account-create-update-mv52x" event={"ID":"b4fae6b8-8d43-4df5-b5c8-4482bf865a73","Type":"ContainerDied","Data":"04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583"} Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.904430 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ad9f-account-create-update-mv52x" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.904450 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04cc1113a8c5817278fb3fc478d345dc6106d1c096bba7e8c1b2041c78c5d583" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.927895 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k9rgk"] Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.940873 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k9rgk"] Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.947937 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9b2b-account-create-update-s4768" event={"ID":"d57f121c-42a8-4515-9b9f-f540a3a78b79","Type":"ContainerDied","Data":"d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b"} Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.947978 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d050c12a178d17eb54c5d1c1ad925c009d57f2ded9de70c7e1625f095136ec7b" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.948046 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-s4768" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.960812 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-f4qps" event={"ID":"8b9f1eb0-ad17-4bd0-b554-bff78a522559","Type":"ContainerDied","Data":"5770e34f4af42f32f964e84b1ca1541ee4b6b63d4dbc16cb4b1e76b284e94364"} Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.960849 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5770e34f4af42f32f964e84b1ca1541ee4b6b63d4dbc16cb4b1e76b284e94364" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.960909 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-f4qps" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.978413 5127 generic.go:334] "Generic (PLEG): container finished" podID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerID="5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c" exitCode=0 Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.978489 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" event={"ID":"8a6e0a90-6e84-4065-89b5-fc45b01d5970","Type":"ContainerDied","Data":"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c"} Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.978533 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" event={"ID":"8a6e0a90-6e84-4065-89b5-fc45b01d5970","Type":"ContainerDied","Data":"fa4343c8e4d7e6f16ba1e62365e74c02b75c7aaba0ceecbb4a05de979da46358"} Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.978548 5127 scope.go:117] "RemoveContainer" containerID="5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.978699 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-9g5pj" Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.992651 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7sc5\" (UniqueName: \"kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5\") pod \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.992747 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config\") pod \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.993054 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc\") pod \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\" (UID: \"8a6e0a90-6e84-4065-89b5-fc45b01d5970\") " Feb 01 07:08:01 crc kubenswrapper[5127]: I0201 07:08:01.998737 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5" (OuterVolumeSpecName: "kube-api-access-z7sc5") pod "8a6e0a90-6e84-4065-89b5-fc45b01d5970" (UID: "8a6e0a90-6e84-4065-89b5-fc45b01d5970"). InnerVolumeSpecName "kube-api-access-z7sc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.004520 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-grk2z" event={"ID":"e9ce0d4c-bd50-4466-83fb-68bea7c4ed61","Type":"ContainerDied","Data":"82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde"} Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.004569 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ecfbeff12ad61ded679175e53117d962d40152751f7d15bc2f3e6c38faffde" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.004660 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-grk2z" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.066048 5127 scope.go:117] "RemoveContainer" containerID="eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.072050 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a6e0a90-6e84-4065-89b5-fc45b01d5970" (UID: "8a6e0a90-6e84-4065-89b5-fc45b01d5970"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.079479 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config" (OuterVolumeSpecName: "config") pod "8a6e0a90-6e84-4065-89b5-fc45b01d5970" (UID: "8a6e0a90-6e84-4065-89b5-fc45b01d5970"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.088058 5127 scope.go:117] "RemoveContainer" containerID="5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c" Feb 01 07:08:02 crc kubenswrapper[5127]: E0201 07:08:02.088406 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c\": container with ID starting with 5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c not found: ID does not exist" containerID="5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.088428 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c"} err="failed to get container status \"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c\": rpc error: code = NotFound desc = could not find container \"5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c\": container with ID starting with 5e916e5af7942b0e0fdb37d68c8d0813cf10974ab4b33a0c1a593f346f04928c not found: ID does not exist" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.088449 5127 scope.go:117] "RemoveContainer" containerID="eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf" Feb 01 07:08:02 crc kubenswrapper[5127]: E0201 07:08:02.089039 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf\": container with ID starting with 
eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf not found: ID does not exist" containerID="eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.089113 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf"} err="failed to get container status \"eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf\": rpc error: code = NotFound desc = could not find container \"eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf\": container with ID starting with eae764ada4ea7c5287d35c68cdf7fd7ed5b50c6aef3619c4591a3ffb2e942edf not found: ID does not exist" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.095768 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.095797 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a6e0a90-6e84-4065-89b5-fc45b01d5970-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.095807 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7sc5\" (UniqueName: \"kubernetes.io/projected/8a6e0a90-6e84-4065-89b5-fc45b01d5970-kube-api-access-z7sc5\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.247684 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d3831b-3f13-441c-b809-40428cfd7b4b" path="/var/lib/kubelet/pods/72d3831b-3f13-441c-b809-40428cfd7b4b/volumes" Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.313490 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:08:02 crc kubenswrapper[5127]: I0201 07:08:02.320138 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-9g5pj"] Feb 01 07:08:03 crc kubenswrapper[5127]: I0201 07:08:03.015373 5127 generic.go:334] "Generic (PLEG): container finished" podID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerID="c82dbbe0eb6ca71336161015ea284573bf1cf53a6b5fb5824650267c1ab2d8a7" exitCode=0 Feb 01 07:08:03 crc kubenswrapper[5127]: I0201 07:08:03.015416 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerDied","Data":"c82dbbe0eb6ca71336161015ea284573bf1cf53a6b5fb5824650267c1ab2d8a7"} Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.024165 5127 generic.go:334] "Generic (PLEG): container finished" podID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerID="d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028" exitCode=0 Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.024275 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerDied","Data":"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028"} Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.027553 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerStarted","Data":"c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f"} 
Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.028314 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.029660 5127 generic.go:334] "Generic (PLEG): container finished" podID="379e85af-3108-4c83-88cb-a71948674382" containerID="9736798c00ea577ff511a799e202c624f7065f1a456d8c24e360b90f890a6de7" exitCode=0 Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.029705 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-47sg7" event={"ID":"379e85af-3108-4c83-88cb-a71948674382","Type":"ContainerDied","Data":"9736798c00ea577ff511a799e202c624f7065f1a456d8c24e360b90f890a6de7"} Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.106231 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.134476937 podStartE2EDuration="51.106213637s" podCreationTimestamp="2026-02-01 07:07:13 +0000 UTC" firstStartedPulling="2026-02-01 07:07:18.938258354 +0000 UTC m=+1189.424160747" lastFinishedPulling="2026-02-01 07:07:29.909995084 +0000 UTC m=+1200.395897447" observedRunningTime="2026-02-01 07:08:04.099211197 +0000 UTC m=+1234.585113560" watchObservedRunningTime="2026-02-01 07:08:04.106213637 +0000 UTC m=+1234.592116000" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162032 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-4qmcj"] Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162830 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4fae6b8-8d43-4df5-b5c8-4482bf865a73" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162851 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4fae6b8-8d43-4df5-b5c8-4482bf865a73" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162874 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="init" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162886 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="init" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162904 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad9d98-24ab-40fe-ac49-63b423cd33de" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162912 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad9d98-24ab-40fe-ac49-63b423cd33de" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162924 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57f121c-42a8-4515-9b9f-f540a3a78b79" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162931 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57f121c-42a8-4515-9b9f-f540a3a78b79" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162946 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d3831b-3f13-441c-b809-40428cfd7b4b" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162953 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d3831b-3f13-441c-b809-40428cfd7b4b" 
containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162966 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162975 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.162988 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f1eb0-ad17-4bd0-b554-bff78a522559" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.162995 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f1eb0-ad17-4bd0-b554-bff78a522559" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: E0201 07:08:04.163007 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="dnsmasq-dns" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163015 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="dnsmasq-dns" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163210 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ad9d98-24ab-40fe-ac49-63b423cd33de" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163227 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163240 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" containerName="dnsmasq-dns" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163256 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57f121c-42a8-4515-9b9f-f540a3a78b79" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163269 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d3831b-3f13-441c-b809-40428cfd7b4b" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163282 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f1eb0-ad17-4bd0-b554-bff78a522559" containerName="mariadb-database-create" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.163292 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4fae6b8-8d43-4df5-b5c8-4482bf865a73" containerName="mariadb-account-create-update" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.164003 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.166326 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lkmxs" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.166647 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.170898 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4qmcj"] Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.253273 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.253330 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.253407 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.253442 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc64l\" (UniqueName: \"kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.255129 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a6e0a90-6e84-4065-89b5-fc45b01d5970" path="/var/lib/kubelet/pods/8a6e0a90-6e84-4065-89b5-fc45b01d5970/volumes" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.282543 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.355431 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc64l\" (UniqueName: \"kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.355539 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.355607 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.355802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.360132 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.360516 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.365254 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.373305 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc64l\" (UniqueName: \"kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l\") pod \"glance-db-sync-4qmcj\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:04 crc kubenswrapper[5127]: I0201 07:08:04.580820 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.039478 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerStarted","Data":"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395"} Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.040496 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.065886 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.609723452 podStartE2EDuration="51.065864705s" podCreationTimestamp="2026-02-01 07:07:14 +0000 UTC" firstStartedPulling="2026-02-01 07:07:29.856413974 +0000 UTC m=+1200.342316357" lastFinishedPulling="2026-02-01 07:07:30.312555247 +0000 UTC m=+1200.798457610" observedRunningTime="2026-02-01 07:08:05.063256404 +0000 UTC m=+1235.549158767" watchObservedRunningTime="2026-02-01 07:08:05.065864705 +0000 UTC m=+1235.551767068" Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.210598 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-4qmcj"] Feb 01 07:08:05 crc kubenswrapper[5127]: W0201 07:08:05.221809 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc5df89b_8911_4464_8b7f_c9716a7243ea.slice/crio-41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6 WatchSource:0}: Error finding container 41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6: Status 404 returned error can't find the container with id 41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6 Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.401162 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573135 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqmtr\" (UniqueName: \"kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573230 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573270 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573312 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573346 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573366 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.573438 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices\") pod \"379e85af-3108-4c83-88cb-a71948674382\" (UID: \"379e85af-3108-4c83-88cb-a71948674382\") "
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.574069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.575012 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.583875 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr" (OuterVolumeSpecName: "kube-api-access-fqmtr") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "kube-api-access-fqmtr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.595897 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.596877 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.608973 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.622492 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts" (OuterVolumeSpecName: "scripts") pod "379e85af-3108-4c83-88cb-a71948674382" (UID: "379e85af-3108-4c83-88cb-a71948674382"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675422 5127 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675458 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675468 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675477 5127 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/379e85af-3108-4c83-88cb-a71948674382-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675486 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqmtr\" (UniqueName: \"kubernetes.io/projected/379e85af-3108-4c83-88cb-a71948674382-kube-api-access-fqmtr\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675496 5127 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/379e85af-3108-4c83-88cb-a71948674382-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:05 crc kubenswrapper[5127]: I0201 07:08:05.675503 5127 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/379e85af-3108-4c83-88cb-a71948674382-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.057574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4qmcj" event={"ID":"dc5df89b-8911-4464-8b7f-c9716a7243ea","Type":"ContainerStarted","Data":"41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6"}
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.058942 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-47sg7" event={"ID":"379e85af-3108-4c83-88cb-a71948674382","Type":"ContainerDied","Data":"652f2804146360ea2f589553c7139013d47a40efd238c2b78ec9967555a30f9a"}
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.058981 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="652f2804146360ea2f589553c7139013d47a40efd238c2b78ec9967555a30f9a"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.058982 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-47sg7"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.740800 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.741349 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.921230 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jmcld"]
Feb 01 07:08:06 crc kubenswrapper[5127]: E0201 07:08:06.921648 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379e85af-3108-4c83-88cb-a71948674382" containerName="swift-ring-rebalance"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.921671 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="379e85af-3108-4c83-88cb-a71948674382" containerName="swift-ring-rebalance"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.921848 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="379e85af-3108-4c83-88cb-a71948674382" containerName="swift-ring-rebalance"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.922463 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.926920 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 01 07:08:06 crc kubenswrapper[5127]: I0201 07:08:06.939425 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jmcld"]
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.108009 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxmw\" (UniqueName: \"kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.108087 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.209831 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxmw\" (UniqueName: \"kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.209976 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.211428 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.241611 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxmw\" (UniqueName: \"kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw\") pod \"root-account-create-update-jmcld\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.254184 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jmcld"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.549655 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jmcld"]
Feb 01 07:08:07 crc kubenswrapper[5127]: W0201 07:08:07.555245 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8876b764_7eab_4430_ae8d_b0d88f3f4394.slice/crio-b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6 WatchSource:0}: Error finding container b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6: Status 404 returned error can't find the container with id b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.924902 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0"
Feb 01 07:08:07 crc kubenswrapper[5127]: I0201 07:08:07.936996 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"swift-storage-0\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " pod="openstack/swift-storage-0"
Feb 01 07:08:07 crc kubenswrapper[5127]: E0201 07:08:07.982450 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8876b764_7eab_4430_ae8d_b0d88f3f4394.slice/crio-050ffe26fe0782c1f8580da18ef585a4bde646662286075d7a19ca9b46fc1466.scope\": RecentStats: unable to find data in memory cache]"
Feb 01 07:08:08 crc kubenswrapper[5127]: I0201 07:08:08.078077 5127 generic.go:334] "Generic (PLEG): container finished" podID="8876b764-7eab-4430-ae8d-b0d88f3f4394" containerID="050ffe26fe0782c1f8580da18ef585a4bde646662286075d7a19ca9b46fc1466" exitCode=0
Feb 01 07:08:08 crc kubenswrapper[5127]: I0201 07:08:08.078133 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jmcld" event={"ID":"8876b764-7eab-4430-ae8d-b0d88f3f4394","Type":"ContainerDied","Data":"050ffe26fe0782c1f8580da18ef585a4bde646662286075d7a19ca9b46fc1466"}
event={"ID":"8876b764-7eab-4430-ae8d-b0d88f3f4394","Type":"ContainerDied","Data":"050ffe26fe0782c1f8580da18ef585a4bde646662286075d7a19ca9b46fc1466"} Feb 01 07:08:08 crc kubenswrapper[5127]: I0201 07:08:08.078178 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jmcld" event={"ID":"8876b764-7eab-4430-ae8d-b0d88f3f4394","Type":"ContainerStarted","Data":"b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6"} Feb 01 07:08:08 crc kubenswrapper[5127]: I0201 07:08:08.120971 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 01 07:08:08 crc kubenswrapper[5127]: I0201 07:08:08.784106 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.087972 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"4c8bb7179e7ba51f0ef68dc2d28ca9439c5e28f50cae2faea0e11c50c1fdfc5d"} Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.454150 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jmcld" Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.472244 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts\") pod \"8876b764-7eab-4430-ae8d-b0d88f3f4394\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.472424 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfxmw\" (UniqueName: \"kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw\") pod \"8876b764-7eab-4430-ae8d-b0d88f3f4394\" (UID: \"8876b764-7eab-4430-ae8d-b0d88f3f4394\") " Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.473141 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8876b764-7eab-4430-ae8d-b0d88f3f4394" (UID: "8876b764-7eab-4430-ae8d-b0d88f3f4394"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.479095 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw" (OuterVolumeSpecName: "kube-api-access-pfxmw") pod "8876b764-7eab-4430-ae8d-b0d88f3f4394" (UID: "8876b764-7eab-4430-ae8d-b0d88f3f4394"). InnerVolumeSpecName "kube-api-access-pfxmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.574010 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfxmw\" (UniqueName: \"kubernetes.io/projected/8876b764-7eab-4430-ae8d-b0d88f3f4394-kube-api-access-pfxmw\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.574053 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8876b764-7eab-4430-ae8d-b0d88f3f4394-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:09 crc kubenswrapper[5127]: I0201 07:08:09.949920 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hqn86" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" probeResult="failure" output=< Feb 01 07:08:09 crc kubenswrapper[5127]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 01 07:08:09 crc kubenswrapper[5127]: > Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.005103 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.015452 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.096823 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jmcld" event={"ID":"8876b764-7eab-4430-ae8d-b0d88f3f4394","Type":"ContainerDied","Data":"b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6"} Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.097127 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0af91130b3a302247e40ffb5d7ab94e2bfc8093c4714759866835c294f225c6" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.096862 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jmcld" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.224170 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hqn86-config-mlw52"] Feb 01 07:08:10 crc kubenswrapper[5127]: E0201 07:08:10.224873 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8876b764-7eab-4430-ae8d-b0d88f3f4394" containerName="mariadb-account-create-update" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.225000 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8876b764-7eab-4430-ae8d-b0d88f3f4394" containerName="mariadb-account-create-update" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.225280 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8876b764-7eab-4430-ae8d-b0d88f3f4394" containerName="mariadb-account-create-update" Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.226018 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.236305 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.272649 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86-config-mlw52"]
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386336 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386423 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386608 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386642 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97wnv\" (UniqueName: \"kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386675 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.386791 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.489656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.489776 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.489852 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.489950 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.489994 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97wnv\" (UniqueName: \"kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.490047 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.490503 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.490616 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.493158 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.493479 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.493514 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.510282 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97wnv\" (UniqueName: \"kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv\") pod \"ovn-controller-hqn86-config-mlw52\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:10 crc kubenswrapper[5127]: I0201 07:08:10.558988 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:14 crc kubenswrapper[5127]: I0201 07:08:14.792894 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 01 07:08:14 crc kubenswrapper[5127]: I0201 07:08:14.958238 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hqn86" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" probeResult="failure" output=<
Feb 01 07:08:14 crc kubenswrapper[5127]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 01 07:08:14 crc kubenswrapper[5127]: >
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.163764 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jr4gn"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.164697 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.179669 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jr4gn"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.260362 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6px8t"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.262238 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.285849 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.285970 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gph\" (UniqueName: \"kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.307049 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6px8t"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.313203 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b29e-account-create-update-tsc7k"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.314942 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.318112 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.344755 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b29e-account-create-update-tsc7k"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.366772 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c35b-account-create-update-fclsw"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.368150 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.370165 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.376387 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c35b-account-create-update-fclsw"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.387525 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.387594 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gph\" (UniqueName: \"kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.387701 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwzfg\" (UniqueName: \"kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.387864 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.393562 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.422360 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gph\" (UniqueName: \"kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph\") pod \"cinder-db-create-jr4gn\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.428751 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-v4dp5"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.430151 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.436515 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.436848 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7vjtc"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.437064 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.437209 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.443112 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v4dp5"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.476969 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-l2pjk"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.478057 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.486125 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jr4gn"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.488846 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwzfg\" (UniqueName: \"kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.488880 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.488934 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.488959 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2tp5\" (UniqueName: \"kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.489019 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.489037 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.489820 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.510029 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l2pjk"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.523772 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.530649 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-293b-account-create-update-9chp8"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.532025 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.534693 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.564743 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwzfg\" (UniqueName: \"kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg\") pod \"barbican-db-create-6px8t\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.566216 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-293b-account-create-update-9chp8"]
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.590846 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.590893 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.590914 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2tp5\" (UniqueName: \"kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.590970 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.590997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.591032 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htt7m\" (UniqueName: \"kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.591048 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.591079 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.591117 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdz24\" (UniqueName: \"kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.591761 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.592445 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.601461 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.643350 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") pod \"cinder-c35b-account-create-update-fclsw\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.648975 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2tp5\" (UniqueName: \"kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5\") pod \"barbican-b29e-account-create-update-tsc7k\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") " pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.691539 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692563 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692676 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htt7m\" (UniqueName: \"kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692695 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692728 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692757 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvpss\" (UniqueName: \"kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692809 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdz24\" (UniqueName: \"kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.692856 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.693802 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.704400 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.704920 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.725189 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htt7m\" (UniqueName: \"kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m\") pod \"neutron-db-create-l2pjk\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") " pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.741002 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdz24\" (UniqueName: \"kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24\") pod \"keystone-db-sync-v4dp5\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.794856 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.795204 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvpss\" (UniqueName: \"kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.795432 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-v4dp5"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.795733 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.815933 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.821450 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvpss\" (UniqueName: \"kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss\") pod \"neutron-293b-account-create-update-9chp8\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") " pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.884688 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:15 crc kubenswrapper[5127]: I0201 07:08:15.937216 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:17 crc kubenswrapper[5127]: I0201 07:08:17.713725 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jr4gn"]
Feb 01 07:08:17 crc kubenswrapper[5127]: I0201 07:08:17.727914 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6px8t"]
Feb 01 07:08:17 crc kubenswrapper[5127]: W0201 07:08:17.738556 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45bf5410_d21e_44a3_b4c7_11fdd25902d0.slice/crio-32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d WatchSource:0}: Error finding container 32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d: Status 404 returned error can't find the container with id 32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d
Feb 01 07:08:17 crc kubenswrapper[5127]: I0201 07:08:17.753257 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b29e-account-create-update-tsc7k"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.088082 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l2pjk"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.096155 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86-config-mlw52"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.113022 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-293b-account-create-update-9chp8"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.124948 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-v4dp5"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.147804 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c35b-account-create-update-fclsw"]
Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.183151 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-293b-account-create-update-9chp8" event={"ID":"fe1b844d-6fec-4f41-83d3-62fe94a2aa43","Type":"ContainerStarted","Data":"fac5f09a331a6834d64fd24562297757fbe2e4037fadc6009cc79a87241470c9"}
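Every kube-api-access-* volume in the mount lines above is the pod's projected service-account token volume; inside the container it appears at the standard path /var/run/secrets/kubernetes.io/serviceaccount. A small sketch meant to run inside one of these pods, purely illustrative:

```go
// Inspect the projected service-account volume from inside a container.
// The directory path is the standard mount point; the file names are the
// three projected sources (token, CA bundle, namespace).
package main

import (
	"fmt"
	"os"
)

func main() {
	const dir = "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, f := range []string{"token", "ca.crt", "namespace"} {
		b, err := os.ReadFile(dir + "/" + f)
		if err != nil {
			fmt.Println(f, "missing:", err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", f, len(b))
	}
}
```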
event={"ID":"fe1b844d-6fec-4f41-83d3-62fe94a2aa43","Type":"ContainerStarted","Data":"fac5f09a331a6834d64fd24562297757fbe2e4037fadc6009cc79a87241470c9"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.185902 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4qmcj" event={"ID":"dc5df89b-8911-4464-8b7f-c9716a7243ea","Type":"ContainerStarted","Data":"affbafd2f2595fb216b991e56f47604b5bf279799e0e1cdba3730599a76111ab"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.194711 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"38b76cedf92a4bf003f4c614f64605b3a7cbd585d2e9ecb5e1043de091b2dd25"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.194750 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"2ba3574e531a65aa332d467e2a747abc31633121e49ff04e0c8f64ec009d6670"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.194763 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"d7b1d3dad0001903399762e7d439bb31968d3d63d3ab70bde24fdd9f1e6316ee"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.194771 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"867db1559afad96de83c300d6dc76b9f79d3c9220b0e2eb9728b097d71713a33"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.197074 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6px8t" event={"ID":"6e86c9ae-529c-41fd-89e6-08de90de4684","Type":"ContainerStarted","Data":"dbeddcf12ac584f74f1a4862f408fe746c0102cea65f98e31f2a82882322563d"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.197102 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6px8t" event={"ID":"6e86c9ae-529c-41fd-89e6-08de90de4684","Type":"ContainerStarted","Data":"9102994da7ecd47aa11bdc8f4f6018c7de7a7d2c5996a5be25aa9fac3b99726f"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.198544 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b29e-account-create-update-tsc7k" event={"ID":"45bf5410-d21e-44a3-b4c7-11fdd25902d0","Type":"ContainerStarted","Data":"69e083b16f8b5c507913fa804b7586d48a866787dd25aea8f10ed56919839621"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.198568 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b29e-account-create-update-tsc7k" event={"ID":"45bf5410-d21e-44a3-b4c7-11fdd25902d0","Type":"ContainerStarted","Data":"32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.203667 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l2pjk" event={"ID":"c64132aa-a148-422a-8d80-b92f9005a34f","Type":"ContainerStarted","Data":"e154237f4b97122d8fff31c4121f6d0ca357df5e856640f9ed9412db59f9f076"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.205549 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-4qmcj" podStartSLOduration=2.224965212 podStartE2EDuration="14.20552847s" podCreationTimestamp="2026-02-01 07:08:04 +0000 UTC" 
firstStartedPulling="2026-02-01 07:08:05.223626715 +0000 UTC m=+1235.709529068" lastFinishedPulling="2026-02-01 07:08:17.204189963 +0000 UTC m=+1247.690092326" observedRunningTime="2026-02-01 07:08:18.199752714 +0000 UTC m=+1248.685655077" watchObservedRunningTime="2026-02-01 07:08:18.20552847 +0000 UTC m=+1248.691430833" Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.211450 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jr4gn" event={"ID":"12387291-7208-4df1-b142-486a24065f71","Type":"ContainerStarted","Data":"f7d727be361972646bce2059917ea1d6779b17a04bb30dbfaa2297ebf735b61e"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.211539 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jr4gn" event={"ID":"12387291-7208-4df1-b142-486a24065f71","Type":"ContainerStarted","Data":"69a0bf34b35ce7c891e5cffc3425ea61ad88991643d9990fcc647bd380c00cad"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.214630 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-mlw52" event={"ID":"84afae77-cbb7-4363-a302-6aca80f9ceba","Type":"ContainerStarted","Data":"3f2d1c096986010095a02bf80cf8c7a795fb54cac9e31256859e196612c95f1c"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.218320 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v4dp5" event={"ID":"cf149136-6376-4e36-96c8-ed8680852c66","Type":"ContainerStarted","Data":"b35b2d8b740ef12019cc6edda52cb6cd53a2f7d7fb856f34719af4964959f9fd"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.221882 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-6px8t" podStartSLOduration=3.221862481 podStartE2EDuration="3.221862481s" podCreationTimestamp="2026-02-01 07:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:18.215022316 +0000 UTC m=+1248.700924679" watchObservedRunningTime="2026-02-01 07:08:18.221862481 +0000 UTC m=+1248.707764844" Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.226422 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c35b-account-create-update-fclsw" event={"ID":"afdf7282-6160-40d2-b9fa-803bf081e1b6","Type":"ContainerStarted","Data":"3bfb2b70e5f792abca8b3b4d88406da4c3a03a9b12f1b2ae492bac509ff2d34e"} Feb 01 07:08:18 crc kubenswrapper[5127]: I0201 07:08:18.241204 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b29e-account-create-update-tsc7k" podStartSLOduration=3.241181424 podStartE2EDuration="3.241181424s" podCreationTimestamp="2026-02-01 07:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:18.236412776 +0000 UTC m=+1248.722315139" watchObservedRunningTime="2026-02-01 07:08:18.241181424 +0000 UTC m=+1248.727083797" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.238189 5127 generic.go:334] "Generic (PLEG): container finished" podID="84afae77-cbb7-4363-a302-6aca80f9ceba" containerID="edfa52efc00b790e97ee37c12541b062c301d11ef0289437fc3e2801ea12e4c2" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.238242 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-mlw52" 
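The pod_startup_latency_tracker lines above decompose as: podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp, and podStartSLOduration = that same interval minus the image-pull window (lastFinishedPulling - firstStartedPulling). For glance-db-sync-4qmcj that is 14.2055s - 11.9806s ≈ 2.2250s, matching the logged value up to rounding; for the two barbican jobs the pull timestamps are zero, so SLO and E2E durations coincide. A quick Go check using the timestamps copied from the log:

```go
// Recompute the glance-db-sync-4qmcj startup durations from the logged
// timestamps (the monotonic "m=+..." suffixes are dropped here).
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches Go's default Time.String() formatting used in the log.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-01 07:08:04 +0000 UTC")
	firstPull := mustParse("2026-02-01 07:08:05.223626715 +0000 UTC")
	lastPull := mustParse("2026-02-01 07:08:17.204189963 +0000 UTC")
	watchObserved := mustParse("2026-02-01 07:08:18.20552847 +0000 UTC")

	e2e := watchObserved.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 14.20552847s, as logged
	fmt.Println("podStartSLOduration:", slo) // ~2.2249s, matching the logged value
}
```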
event={"ID":"84afae77-cbb7-4363-a302-6aca80f9ceba","Type":"ContainerDied","Data":"edfa52efc00b790e97ee37c12541b062c301d11ef0289437fc3e2801ea12e4c2"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.241760 5127 generic.go:334] "Generic (PLEG): container finished" podID="6e86c9ae-529c-41fd-89e6-08de90de4684" containerID="dbeddcf12ac584f74f1a4862f408fe746c0102cea65f98e31f2a82882322563d" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.241826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6px8t" event={"ID":"6e86c9ae-529c-41fd-89e6-08de90de4684","Type":"ContainerDied","Data":"dbeddcf12ac584f74f1a4862f408fe746c0102cea65f98e31f2a82882322563d"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.246026 5127 generic.go:334] "Generic (PLEG): container finished" podID="45bf5410-d21e-44a3-b4c7-11fdd25902d0" containerID="69e083b16f8b5c507913fa804b7586d48a866787dd25aea8f10ed56919839621" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.246092 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b29e-account-create-update-tsc7k" event={"ID":"45bf5410-d21e-44a3-b4c7-11fdd25902d0","Type":"ContainerDied","Data":"69e083b16f8b5c507913fa804b7586d48a866787dd25aea8f10ed56919839621"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.249544 5127 generic.go:334] "Generic (PLEG): container finished" podID="c64132aa-a148-422a-8d80-b92f9005a34f" containerID="bea40ed57ae34e892c6123f528c511c5b0399243d5f26fad517fdb141bad12ef" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.249645 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l2pjk" event={"ID":"c64132aa-a148-422a-8d80-b92f9005a34f","Type":"ContainerDied","Data":"bea40ed57ae34e892c6123f528c511c5b0399243d5f26fad517fdb141bad12ef"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.252220 5127 generic.go:334] "Generic (PLEG): container finished" podID="afdf7282-6160-40d2-b9fa-803bf081e1b6" containerID="9f5b2e615ef47dfbbbd7c69e708ad20e93453ce23c36b600cace53513255813f" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.252284 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c35b-account-create-update-fclsw" event={"ID":"afdf7282-6160-40d2-b9fa-803bf081e1b6","Type":"ContainerDied","Data":"9f5b2e615ef47dfbbbd7c69e708ad20e93453ce23c36b600cace53513255813f"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.256655 5127 generic.go:334] "Generic (PLEG): container finished" podID="12387291-7208-4df1-b142-486a24065f71" containerID="f7d727be361972646bce2059917ea1d6779b17a04bb30dbfaa2297ebf735b61e" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.256718 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jr4gn" event={"ID":"12387291-7208-4df1-b142-486a24065f71","Type":"ContainerDied","Data":"f7d727be361972646bce2059917ea1d6779b17a04bb30dbfaa2297ebf735b61e"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.259135 5127 generic.go:334] "Generic (PLEG): container finished" podID="fe1b844d-6fec-4f41-83d3-62fe94a2aa43" containerID="e1a7c551036a87709431fd53fe21f4c8c165139e9e543a5271f4b6cc8e281722" exitCode=0 Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.259456 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-293b-account-create-update-9chp8" 
event={"ID":"fe1b844d-6fec-4f41-83d3-62fe94a2aa43","Type":"ContainerDied","Data":"e1a7c551036a87709431fd53fe21f4c8c165139e9e543a5271f4b6cc8e281722"} Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.639518 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jr4gn" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.770266 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts\") pod \"12387291-7208-4df1-b142-486a24065f71\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.771703 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27gph\" (UniqueName: \"kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph\") pod \"12387291-7208-4df1-b142-486a24065f71\" (UID: \"12387291-7208-4df1-b142-486a24065f71\") " Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.772098 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12387291-7208-4df1-b142-486a24065f71" (UID: "12387291-7208-4df1-b142-486a24065f71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.772482 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12387291-7208-4df1-b142-486a24065f71-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.774835 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph" (OuterVolumeSpecName: "kube-api-access-27gph") pod "12387291-7208-4df1-b142-486a24065f71" (UID: "12387291-7208-4df1-b142-486a24065f71"). InnerVolumeSpecName "kube-api-access-27gph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.873758 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27gph\" (UniqueName: \"kubernetes.io/projected/12387291-7208-4df1-b142-486a24065f71-kube-api-access-27gph\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:19 crc kubenswrapper[5127]: I0201 07:08:19.966241 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hqn86" Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.280562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jr4gn" event={"ID":"12387291-7208-4df1-b142-486a24065f71","Type":"ContainerDied","Data":"69a0bf34b35ce7c891e5cffc3425ea61ad88991643d9990fcc647bd380c00cad"} Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.280631 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a0bf34b35ce7c891e5cffc3425ea61ad88991643d9990fcc647bd380c00cad" Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.280697 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.289563 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"70ea6924342a0f91d794401284c82d8dac971be34d4d31d1be2404903e52efc7"}
Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.289638 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"cd0408279605bd61bef597ecbbdac3b1f047aa35e8239141c5d37982ce44fb47"}
Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.289653 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"d54876c49569e6c608f8538949b55b1c199573d434261c07be1e7783a323003f"}
Feb 01 07:08:20 crc kubenswrapper[5127]: I0201 07:08:20.289664 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"c9704dce7fc0e07cc3a655f4772728e2831f5da440ded50d8d887f0c56f5d13f"}
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.796920 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-293b-account-create-update-9chp8"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.825657 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6px8t"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.827088 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l2pjk"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.882111 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c35b-account-create-update-fclsw"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.883862 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-mlw52"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.884155 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b29e-account-create-update-tsc7k"
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.889138 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts\") pod \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.889212 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvpss\" (UniqueName: \"kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss\") pod \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\" (UID: \"fe1b844d-6fec-4f41-83d3-62fe94a2aa43\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.890439 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe1b844d-6fec-4f41-83d3-62fe94a2aa43" (UID: "fe1b844d-6fec-4f41-83d3-62fe94a2aa43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.899164 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss" (OuterVolumeSpecName: "kube-api-access-cvpss") pod "fe1b844d-6fec-4f41-83d3-62fe94a2aa43" (UID: "fe1b844d-6fec-4f41-83d3-62fe94a2aa43"). InnerVolumeSpecName "kube-api-access-cvpss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990263 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts\") pod \"c64132aa-a148-422a-8d80-b92f9005a34f\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990337 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990376 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts\") pod \"6e86c9ae-529c-41fd-89e6-08de90de4684\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990397 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htt7m\" (UniqueName: \"kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m\") pod \"c64132aa-a148-422a-8d80-b92f9005a34f\" (UID: \"c64132aa-a148-422a-8d80-b92f9005a34f\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990430 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990446 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts\") pod \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990461 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2tp5\" (UniqueName: \"kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5\") pod \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\" (UID: \"45bf5410-d21e-44a3-b4c7-11fdd25902d0\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990505 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") "
Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990546 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") pod \"afdf7282-6160-40d2-b9fa-803bf081e1b6\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") "
volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") pod \"afdf7282-6160-40d2-b9fa-803bf081e1b6\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990567 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts\") pod \"afdf7282-6160-40d2-b9fa-803bf081e1b6\" (UID: \"afdf7282-6160-40d2-b9fa-803bf081e1b6\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990596 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97wnv\" (UniqueName: \"kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990663 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwzfg\" (UniqueName: \"kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg\") pod \"6e86c9ae-529c-41fd-89e6-08de90de4684\" (UID: \"6e86c9ae-529c-41fd-89e6-08de90de4684\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990682 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.990753 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn\") pod \"84afae77-cbb7-4363-a302-6aca80f9ceba\" (UID: \"84afae77-cbb7-4363-a302-6aca80f9ceba\") " Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991045 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991055 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvpss\" (UniqueName: \"kubernetes.io/projected/fe1b844d-6fec-4f41-83d3-62fe94a2aa43-kube-api-access-cvpss\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991092 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991511 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c64132aa-a148-422a-8d80-b92f9005a34f" (UID: "c64132aa-a148-422a-8d80-b92f9005a34f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991540 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run" (OuterVolumeSpecName: "var-run") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.991870 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e86c9ae-529c-41fd-89e6-08de90de4684" (UID: "6e86c9ae-529c-41fd-89e6-08de90de4684"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.993830 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afdf7282-6160-40d2-b9fa-803bf081e1b6" (UID: "afdf7282-6160-40d2-b9fa-803bf081e1b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.994034 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45bf5410-d21e-44a3-b4c7-11fdd25902d0" (UID: "45bf5410-d21e-44a3-b4c7-11fdd25902d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.994497 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m" (OuterVolumeSpecName: "kube-api-access-htt7m") pod "c64132aa-a148-422a-8d80-b92f9005a34f" (UID: "c64132aa-a148-422a-8d80-b92f9005a34f"). InnerVolumeSpecName "kube-api-access-htt7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.994526 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.994527 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.994563 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24" (OuterVolumeSpecName: "kube-api-access-9vm24") pod "afdf7282-6160-40d2-b9fa-803bf081e1b6" (UID: "afdf7282-6160-40d2-b9fa-803bf081e1b6"). 
InnerVolumeSpecName "kube-api-access-9vm24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.996067 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts" (OuterVolumeSpecName: "scripts") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.997303 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg" (OuterVolumeSpecName: "kube-api-access-pwzfg") pod "6e86c9ae-529c-41fd-89e6-08de90de4684" (UID: "6e86c9ae-529c-41fd-89e6-08de90de4684"). InnerVolumeSpecName "kube-api-access-pwzfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.998651 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv" (OuterVolumeSpecName: "kube-api-access-97wnv") pod "84afae77-cbb7-4363-a302-6aca80f9ceba" (UID: "84afae77-cbb7-4363-a302-6aca80f9ceba"). InnerVolumeSpecName "kube-api-access-97wnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:25 crc kubenswrapper[5127]: I0201 07:08:25.999337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5" (OuterVolumeSpecName: "kube-api-access-h2tp5") pod "45bf5410-d21e-44a3-b4c7-11fdd25902d0" (UID: "45bf5410-d21e-44a3-b4c7-11fdd25902d0"). InnerVolumeSpecName "kube-api-access-h2tp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092449 5127 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092483 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c64132aa-a148-422a-8d80-b92f9005a34f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092493 5127 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092501 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e86c9ae-529c-41fd-89e6-08de90de4684-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092510 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htt7m\" (UniqueName: \"kubernetes.io/projected/c64132aa-a148-422a-8d80-b92f9005a34f-kube-api-access-htt7m\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092519 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45bf5410-d21e-44a3-b4c7-11fdd25902d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092527 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2tp5\" (UniqueName: \"kubernetes.io/projected/45bf5410-d21e-44a3-b4c7-11fdd25902d0-kube-api-access-h2tp5\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092535 5127 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092545 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84afae77-cbb7-4363-a302-6aca80f9ceba-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092553 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vm24\" (UniqueName: \"kubernetes.io/projected/afdf7282-6160-40d2-b9fa-803bf081e1b6-kube-api-access-9vm24\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092561 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afdf7282-6160-40d2-b9fa-803bf081e1b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.092569 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97wnv\" (UniqueName: \"kubernetes.io/projected/84afae77-cbb7-4363-a302-6aca80f9ceba-kube-api-access-97wnv\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.093136 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwzfg\" (UniqueName: 
\"kubernetes.io/projected/6e86c9ae-529c-41fd-89e6-08de90de4684-kube-api-access-pwzfg\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.093150 5127 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/84afae77-cbb7-4363-a302-6aca80f9ceba-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.368217 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-mlw52" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.368365 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-mlw52" event={"ID":"84afae77-cbb7-4363-a302-6aca80f9ceba","Type":"ContainerDied","Data":"3f2d1c096986010095a02bf80cf8c7a795fb54cac9e31256859e196612c95f1c"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.369095 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2d1c096986010095a02bf80cf8c7a795fb54cac9e31256859e196612c95f1c" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.371383 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6px8t" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.371418 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6px8t" event={"ID":"6e86c9ae-529c-41fd-89e6-08de90de4684","Type":"ContainerDied","Data":"9102994da7ecd47aa11bdc8f4f6018c7de7a7d2c5996a5be25aa9fac3b99726f"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.371444 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9102994da7ecd47aa11bdc8f4f6018c7de7a7d2c5996a5be25aa9fac3b99726f" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.374492 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v4dp5" event={"ID":"cf149136-6376-4e36-96c8-ed8680852c66","Type":"ContainerStarted","Data":"31960747ee7b4ab9611882d5c02143297f6b9aee515db90ed14f82b4b29d99ba"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.381510 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b29e-account-create-update-tsc7k" event={"ID":"45bf5410-d21e-44a3-b4c7-11fdd25902d0","Type":"ContainerDied","Data":"32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.381559 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32066304f9830bd73d9a47a561e0eeab2f13f0be954520deb5f1bc8fdcfa114d" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.381652 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b29e-account-create-update-tsc7k" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.384197 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l2pjk" event={"ID":"c64132aa-a148-422a-8d80-b92f9005a34f","Type":"ContainerDied","Data":"e154237f4b97122d8fff31c4121f6d0ca357df5e856640f9ed9412db59f9f076"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.384236 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e154237f4b97122d8fff31c4121f6d0ca357df5e856640f9ed9412db59f9f076" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.384304 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-l2pjk" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.394175 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c35b-account-create-update-fclsw" event={"ID":"afdf7282-6160-40d2-b9fa-803bf081e1b6","Type":"ContainerDied","Data":"3bfb2b70e5f792abca8b3b4d88406da4c3a03a9b12f1b2ae492bac509ff2d34e"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.394208 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bfb2b70e5f792abca8b3b4d88406da4c3a03a9b12f1b2ae492bac509ff2d34e" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.394271 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c35b-account-create-update-fclsw" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.401482 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-v4dp5" podStartSLOduration=3.815673379 podStartE2EDuration="11.401465884s" podCreationTimestamp="2026-02-01 07:08:15 +0000 UTC" firstStartedPulling="2026-02-01 07:08:18.120533239 +0000 UTC m=+1248.606435602" lastFinishedPulling="2026-02-01 07:08:25.706325744 +0000 UTC m=+1256.192228107" observedRunningTime="2026-02-01 07:08:26.396897961 +0000 UTC m=+1256.882800324" watchObservedRunningTime="2026-02-01 07:08:26.401465884 +0000 UTC m=+1256.887368247" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.401525 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-293b-account-create-update-9chp8" event={"ID":"fe1b844d-6fec-4f41-83d3-62fe94a2aa43","Type":"ContainerDied","Data":"fac5f09a331a6834d64fd24562297757fbe2e4037fadc6009cc79a87241470c9"} Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.401594 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac5f09a331a6834d64fd24562297757fbe2e4037fadc6009cc79a87241470c9" Feb 01 07:08:26 crc kubenswrapper[5127]: I0201 07:08:26.401675 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.064258 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hqn86-config-mlw52"]
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.083636 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hqn86-config-mlw52"]
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166079 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hqn86-config-hwwv5"]
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166422 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84afae77-cbb7-4363-a302-6aca80f9ceba" containerName="ovn-config"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166439 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="84afae77-cbb7-4363-a302-6aca80f9ceba" containerName="ovn-config"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166457 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45bf5410-d21e-44a3-b4c7-11fdd25902d0" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166463 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="45bf5410-d21e-44a3-b4c7-11fdd25902d0" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166480 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afdf7282-6160-40d2-b9fa-803bf081e1b6" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166486 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="afdf7282-6160-40d2-b9fa-803bf081e1b6" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166502 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64132aa-a148-422a-8d80-b92f9005a34f" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166509 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64132aa-a148-422a-8d80-b92f9005a34f" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166517 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe1b844d-6fec-4f41-83d3-62fe94a2aa43" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166524 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe1b844d-6fec-4f41-83d3-62fe94a2aa43" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166538 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12387291-7208-4df1-b142-486a24065f71" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166544 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="12387291-7208-4df1-b142-486a24065f71" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: E0201 07:08:27.166553 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e86c9ae-529c-41fd-89e6-08de90de4684" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166559 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e86c9ae-529c-41fd-89e6-08de90de4684" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166701 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="45bf5410-d21e-44a3-b4c7-11fdd25902d0" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166717 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe1b844d-6fec-4f41-83d3-62fe94a2aa43" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166729 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64132aa-a148-422a-8d80-b92f9005a34f" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166735 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="12387291-7208-4df1-b142-486a24065f71" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166744 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e86c9ae-529c-41fd-89e6-08de90de4684" containerName="mariadb-database-create"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166752 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="84afae77-cbb7-4363-a302-6aca80f9ceba" containerName="ovn-config"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.166760 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="afdf7282-6160-40d2-b9fa-803bf081e1b6" containerName="mariadb-account-create-update"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.167229 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-hwwv5"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.172285 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.180598 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86-config-hwwv5"]
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309495 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309558 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5"
Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309673 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5"
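
The cpu_manager/memory_manager RemoveStaleState lines fire when the replacement config pod is admitted: resource-manager state still keyed by the UIDs of the just-deleted one-shot pods is purged before admission. A toy version of that purge (types and map layout are assumptions for illustration):

```go
// Toy RemoveStaleState: purge per-container resource state whose pod UID is
// no longer active.
package main

import "fmt"

type containerKey struct {
	podUID        string
	containerName string
}

func main() {
	// Assignments left behind by the completed one-shot pods seen above.
	assignments := map[containerKey]string{
		{"84afae77-cbb7-4363-a302-6aca80f9ceba", "ovn-config"}:              "cpuset 0-3",
		{"12387291-7208-4df1-b142-486a24065f71", "mariadb-database-create"}: "cpuset 0-3",
	}
	active := map[string]bool{} // none of those pods still exist

	for key := range assignments {
		if !active[key.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				key.podUID, key.containerName)
			delete(assignments, key) // deleting during range is safe for Go maps
		}
	}
}
```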
pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309704 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4x7c\" (UniqueName: \"kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.309745 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.412064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.412491 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.412521 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.412395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.416526 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.416539 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.416958 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4x7c\" (UniqueName: \"kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: 
\"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.416633 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.417134 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.418877 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.420070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.429566 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"914b8bb69bc3bfe2d7935699ef76aca574042432793c4d5754b940ebe207865b"} Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.429629 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"d37333ecd6017a5cdc098711dfbdfa4e7ddb88dafd4fb0421fa3c8183a90db30"} Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.429645 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"761c57fdee0d6b1288274f98290bb8cd974e5bc157c50992d2820212429734cd"} Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.437234 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4x7c\" (UniqueName: \"kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c\") pod \"ovn-controller-hqn86-config-hwwv5\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:27 crc kubenswrapper[5127]: I0201 07:08:27.522386 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.227554 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hqn86-config-hwwv5"] Feb 01 07:08:28 crc kubenswrapper[5127]: W0201 07:08:28.230657 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2aa22a35_ebcb_42f9_83cf_9b9fbcdaa773.slice/crio-c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e WatchSource:0}: Error finding container c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e: Status 404 returned error can't find the container with id c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.251199 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84afae77-cbb7-4363-a302-6aca80f9ceba" path="/var/lib/kubelet/pods/84afae77-cbb7-4363-a302-6aca80f9ceba/volumes" Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.438435 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-hwwv5" event={"ID":"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773","Type":"ContainerStarted","Data":"c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e"} Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.446891 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"005c8c714d8be3311a798fc93522b27e5504130f9c1fa418f83c1ab86906035c"} Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.446936 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"5653ed02c5b90531b86d9ac767b79937dac0b76281e108a3a937c34943529698"} Feb 01 07:08:28 crc kubenswrapper[5127]: I0201 07:08:28.446949 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"95be98dc047c279bbce09d7aa189270919803433cfe5dc74d1073beb651e9b25"} Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.463683 5127 generic.go:334] "Generic (PLEG): container finished" podID="dc5df89b-8911-4464-8b7f-c9716a7243ea" containerID="affbafd2f2595fb216b991e56f47604b5bf279799e0e1cdba3730599a76111ab" exitCode=0 Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.463774 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4qmcj" event={"ID":"dc5df89b-8911-4464-8b7f-c9716a7243ea","Type":"ContainerDied","Data":"affbafd2f2595fb216b991e56f47604b5bf279799e0e1cdba3730599a76111ab"} Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.481017 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerStarted","Data":"2346499dd9e7c21de3823593069c8520d97c16bb6dede126e55ac71fc4a085b0"} Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.484885 5127 generic.go:334] "Generic (PLEG): container finished" podID="2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" containerID="e60c44613d67c39a8a8a24961d2a3544213836e4d6601012971fcd1537cbd5f1" exitCode=0 Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.484976 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-hwwv5" 
event={"ID":"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773","Type":"ContainerDied","Data":"e60c44613d67c39a8a8a24961d2a3544213836e4d6601012971fcd1537cbd5f1"} Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.600331 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.86492268 podStartE2EDuration="39.600258995s" podCreationTimestamp="2026-02-01 07:07:50 +0000 UTC" firstStartedPulling="2026-02-01 07:08:08.805869781 +0000 UTC m=+1239.291772144" lastFinishedPulling="2026-02-01 07:08:26.541206096 +0000 UTC m=+1257.027108459" observedRunningTime="2026-02-01 07:08:29.586061661 +0000 UTC m=+1260.071964064" watchObservedRunningTime="2026-02-01 07:08:29.600258995 +0000 UTC m=+1260.086161398" Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.941304 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"] Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.945151 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.950959 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 01 07:08:29 crc kubenswrapper[5127]: I0201 07:08:29.961799 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"] Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075389 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075457 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075487 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075808 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjgb6\" (UniqueName: \"kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075917 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.075944 
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177349 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjgb6\" (UniqueName: \"kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177465 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177486 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-config\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177515 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177552 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.177593 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.178508 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.178705 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.178794 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-config\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.178807 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.179495 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.190175 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.196050 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjgb6\" (UniqueName: \"kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6\") pod \"dnsmasq-dns-8467b54bcc-sthmg\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.273293 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.495906 5127 generic.go:334] "Generic (PLEG): container finished" podID="cf149136-6376-4e36-96c8-ed8680852c66" containerID="31960747ee7b4ab9611882d5c02143297f6b9aee515db90ed14f82b4b29d99ba" exitCode=0
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.496004 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v4dp5" event={"ID":"cf149136-6376-4e36-96c8-ed8680852c66","Type":"ContainerDied","Data":"31960747ee7b4ab9611882d5c02143297f6b9aee515db90ed14f82b4b29d99ba"}
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.735724 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"]
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.826075 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-hwwv5"
Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.880310 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-4qmcj"
Need to start a new one" pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.898657 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.899831 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4x7c\" (UniqueName: \"kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.899941 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900091 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.899765 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900260 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900350 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900494 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts\") pod \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\" (UID: \"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773\") " Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900356 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run" (OuterVolumeSpecName: "var-run") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.900431 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.901039 5127 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.901099 5127 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.901146 5127 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.901194 5127 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.901494 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts" (OuterVolumeSpecName: "scripts") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:30 crc kubenswrapper[5127]: I0201 07:08:30.905363 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c" (OuterVolumeSpecName: "kube-api-access-s4x7c") pod "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" (UID: "2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773"). InnerVolumeSpecName "kube-api-access-s4x7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.002417 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc64l\" (UniqueName: \"kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l\") pod \"dc5df89b-8911-4464-8b7f-c9716a7243ea\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.002458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data\") pod \"dc5df89b-8911-4464-8b7f-c9716a7243ea\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.002597 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle\") pod \"dc5df89b-8911-4464-8b7f-c9716a7243ea\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.002634 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data\") pod \"dc5df89b-8911-4464-8b7f-c9716a7243ea\" (UID: \"dc5df89b-8911-4464-8b7f-c9716a7243ea\") " Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.003024 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4x7c\" (UniqueName: \"kubernetes.io/projected/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-kube-api-access-s4x7c\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.003043 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.006813 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l" (OuterVolumeSpecName: "kube-api-access-vc64l") pod "dc5df89b-8911-4464-8b7f-c9716a7243ea" (UID: "dc5df89b-8911-4464-8b7f-c9716a7243ea"). InnerVolumeSpecName "kube-api-access-vc64l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.008398 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dc5df89b-8911-4464-8b7f-c9716a7243ea" (UID: "dc5df89b-8911-4464-8b7f-c9716a7243ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.022789 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc5df89b-8911-4464-8b7f-c9716a7243ea" (UID: "dc5df89b-8911-4464-8b7f-c9716a7243ea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.058548 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data" (OuterVolumeSpecName: "config-data") pod "dc5df89b-8911-4464-8b7f-c9716a7243ea" (UID: "dc5df89b-8911-4464-8b7f-c9716a7243ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.104848 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.104892 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.104902 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc64l\" (UniqueName: \"kubernetes.io/projected/dc5df89b-8911-4464-8b7f-c9716a7243ea-kube-api-access-vc64l\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.104916 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5df89b-8911-4464-8b7f-c9716a7243ea-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.509962 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86-config-hwwv5" event={"ID":"2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773","Type":"ContainerDied","Data":"c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e"} Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.510019 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2145473372cda51d113d764f92d1566068c09c8d589891559b98906a9ba457e" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.510037 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86-config-hwwv5" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.514254 5127 generic.go:334] "Generic (PLEG): container finished" podID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerID="484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07" exitCode=0 Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.514352 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" event={"ID":"cf72a7f8-1033-45bc-9d5d-84473b29d28f","Type":"ContainerDied","Data":"484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07"} Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.514428 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" event={"ID":"cf72a7f8-1033-45bc-9d5d-84473b29d28f","Type":"ContainerStarted","Data":"52f522cffb4ad659bf0c46b00cccf24d16cb72e6a204e2176aa1fa71c0976855"} Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.517163 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-4qmcj" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.518727 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-4qmcj" event={"ID":"dc5df89b-8911-4464-8b7f-c9716a7243ea","Type":"ContainerDied","Data":"41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6"} Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.518783 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41affb1684453fd055a42bb1f6f18efe33967c96ac76ea1bfb1615f64cc276a6" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.855376 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"] Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.916178 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:31 crc kubenswrapper[5127]: E0201 07:08:31.916509 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" containerName="ovn-config" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.916520 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" containerName="ovn-config" Feb 01 07:08:31 crc kubenswrapper[5127]: E0201 07:08:31.916540 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5df89b-8911-4464-8b7f-c9716a7243ea" containerName="glance-db-sync" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.916546 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5df89b-8911-4464-8b7f-c9716a7243ea" containerName="glance-db-sync" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.916712 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" containerName="ovn-config" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.916725 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5df89b-8911-4464-8b7f-c9716a7243ea" containerName="glance-db-sync" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.917494 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.934823 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.952728 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hqn86-config-hwwv5"] Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.979442 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hqn86-config-hwwv5"] Feb 01 07:08:31 crc kubenswrapper[5127]: I0201 07:08:31.983449 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v4dp5" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024816 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024859 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024890 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024939 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.024991 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zjb\" (UniqueName: \"kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126303 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data\") pod \"cf149136-6376-4e36-96c8-ed8680852c66\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126464 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdz24\" (UniqueName: \"kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24\") pod \"cf149136-6376-4e36-96c8-ed8680852c66\" (UID: \"cf149136-6376-4e36-96c8-ed8680852c66\") " Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126518 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle\") pod \"cf149136-6376-4e36-96c8-ed8680852c66\" (UID: 
\"cf149136-6376-4e36-96c8-ed8680852c66\") " Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126827 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126862 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126893 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126938 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126966 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.126988 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5zjb\" (UniqueName: \"kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.128828 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.129058 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.129340 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.129495 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.129631 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.137746 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24" (OuterVolumeSpecName: "kube-api-access-wdz24") pod "cf149136-6376-4e36-96c8-ed8680852c66" (UID: "cf149136-6376-4e36-96c8-ed8680852c66"). InnerVolumeSpecName "kube-api-access-wdz24". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.143318 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5zjb\" (UniqueName: \"kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb\") pod \"dnsmasq-dns-56c9bc6f5c-45g9w\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.149255 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf149136-6376-4e36-96c8-ed8680852c66" (UID: "cf149136-6376-4e36-96c8-ed8680852c66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.170361 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data" (OuterVolumeSpecName: "config-data") pod "cf149136-6376-4e36-96c8-ed8680852c66" (UID: "cf149136-6376-4e36-96c8-ed8680852c66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.228742 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.228876 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdz24\" (UniqueName: \"kubernetes.io/projected/cf149136-6376-4e36-96c8-ed8680852c66-kube-api-access-wdz24\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.228954 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf149136-6376-4e36-96c8-ed8680852c66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.247052 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773" path="/var/lib/kubelet/pods/2aa22a35-ebcb-42f9-83cf-9b9fbcdaa773/volumes" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.259433 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.530168 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-v4dp5" event={"ID":"cf149136-6376-4e36-96c8-ed8680852c66","Type":"ContainerDied","Data":"b35b2d8b740ef12019cc6edda52cb6cd53a2f7d7fb856f34719af4964959f9fd"} Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.530442 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b35b2d8b740ef12019cc6edda52cb6cd53a2f7d7fb856f34719af4964959f9fd" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.530498 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-v4dp5" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.532930 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" event={"ID":"cf72a7f8-1033-45bc-9d5d-84473b29d28f","Type":"ContainerStarted","Data":"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa"} Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.533061 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="dnsmasq-dns" containerID="cri-o://9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa" gracePeriod=10 Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.533362 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.564983 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" podStartSLOduration=3.564963021 podStartE2EDuration="3.564963021s" podCreationTimestamp="2026-02-01 07:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:32.564220381 +0000 UTC m=+1263.050122744" watchObservedRunningTime="2026-02-01 07:08:32.564963021 +0000 UTC m=+1263.050865394" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.708175 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.750181 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xxjcv"] Feb 01 07:08:32 crc kubenswrapper[5127]: E0201 07:08:32.750746 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf149136-6376-4e36-96c8-ed8680852c66" containerName="keystone-db-sync" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.750768 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf149136-6376-4e36-96c8-ed8680852c66" containerName="keystone-db-sync" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.751037 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf149136-6376-4e36-96c8-ed8680852c66" containerName="keystone-db-sync" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.751793 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.753965 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.754278 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.754384 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.754484 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.759448 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7vjtc" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.769630 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.773698 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.785642 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xxjcv"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.806685 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.821442 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.843921 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.843959 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.843992 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844009 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844060 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844082 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844135 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kgl5\" (UniqueName: \"kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844172 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l8qd\" (UniqueName: \"kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844194 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844217 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844246 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.844261 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.942391 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-w7586"] Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.945453 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946838 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946890 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kgl5\" (UniqueName: \"kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946926 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l8qd\" (UniqueName: \"kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946953 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.946973 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947002 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947019 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947041 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: 
I0201 07:08:32.947057 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947083 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947098 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.947929 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.948501 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.949107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.951487 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.953119 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.964814 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.965432 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 01 07:08:32 crc kubenswrapper[5127]: 
I0201 07:08:32.965544 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.965889 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8l8d5" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.976182 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.983128 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l8qd\" (UniqueName: \"kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.986537 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.988376 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kgl5\" (UniqueName: \"kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5\") pod \"dnsmasq-dns-54b4bb76d5-n86ld\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:32 crc kubenswrapper[5127]: I0201 07:08:32.992091 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.002614 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data\") pod \"keystone-bootstrap-xxjcv\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.007207 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-w7586"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.015528 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.019675 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.025549 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.025804 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.038744 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z9xf9"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.039834 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.043925 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.044105 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.043995 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2pfbp" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051559 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sjg2\" (UniqueName: \"kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051651 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051684 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051732 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051779 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.051813 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " 
pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.061751 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.075963 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.098913 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z9xf9"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.134655 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-mxrjb"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.135808 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.177026 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.178770 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.178822 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.178846 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179050 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179102 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179188 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g54fr\" (UniqueName: \"kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179250 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179304 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179351 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sjg2\" (UniqueName: \"kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179388 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9hk9\" (UniqueName: \"kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179446 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179529 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179626 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179702 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179743 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.179802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 
07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.182036 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.182257 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zl5lx" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.182571 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.205764 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ssck9"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.211654 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.228520 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qqq8k" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.233038 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.234938 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.239864 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sjg2\" (UniqueName: \"kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.286052 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.287450 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.287650 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.287745 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.287853 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.287960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx6jc\" (UniqueName: \"kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288092 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288192 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288277 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288368 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g54fr\" (UniqueName: \"kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288453 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288568 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.288683 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9hk9\" (UniqueName: \"kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.290498 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.291808 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.292102 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq7sl\" (UniqueName: \"kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.292209 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.292315 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.292800 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.294368 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.301437 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.302490 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.323382 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.323940 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.324316 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.324920 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.337421 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.340364 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mxrjb"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.344061 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g54fr\" (UniqueName: \"kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr\") pod \"neutron-db-sync-z9xf9\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") " pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.351924 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data\") pod \"cinder-db-sync-w7586\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.352180 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9hk9\" (UniqueName: \"kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.356388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397063 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397189 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq7sl\" (UniqueName: \"kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl\") pod \"placement-db-sync-ssck9\" 
(UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397603 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397647 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx6jc\" (UniqueName: \"kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397727 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397747 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.397785 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.400211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.403212 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.408278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.412564 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.413260 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.414167 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") " pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.422192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.430976 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ssck9"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.460442 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq7sl\" (UniqueName: \"kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl\") pod \"placement-db-sync-ssck9\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") " pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.462980 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx6jc\" (UniqueName: \"kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc\") pod \"barbican-db-sync-mxrjb\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") " pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.481640 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.484483 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9xf9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.492114 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.493917 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mxrjb" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.494948 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.511484 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ssck9" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.519370 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.543030 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.543872 5127 generic.go:334] "Generic (PLEG): container finished" podID="626d256d-3b31-4da0-9adf-6b1c11e8330d" containerID="dd161ee891128b5db5505282c5d987d5b4738d7452fa81f4186041b020cc6cfd" exitCode=0 Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.543918 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" event={"ID":"626d256d-3b31-4da0-9adf-6b1c11e8330d","Type":"ContainerDied","Data":"dd161ee891128b5db5505282c5d987d5b4738d7452fa81f4186041b020cc6cfd"} Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.543941 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" event={"ID":"626d256d-3b31-4da0-9adf-6b1c11e8330d","Type":"ContainerStarted","Data":"dfd1b5ed2296fa9f199abbcff96abbd154ffe17ce80385a0145e1ae5adf01d27"} Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.587833 5127 generic.go:334] "Generic (PLEG): container finished" podID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerID="9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa" exitCode=0 Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.587871 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" event={"ID":"cf72a7f8-1033-45bc-9d5d-84473b29d28f","Type":"ContainerDied","Data":"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa"} Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.587989 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.588013 5127 scope.go:117] "RemoveContainer" containerID="9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.587906 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-sthmg" event={"ID":"cf72a7f8-1033-45bc-9d5d-84473b29d28f","Type":"ContainerDied","Data":"52f522cffb4ad659bf0c46b00cccf24d16cb72e6a204e2176aa1fa71c0976855"} Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.596383 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-w7586" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.600840 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.600887 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.600918 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.600961 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.601012 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.601099 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhmj\" (UniqueName: \"kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.695779 5127 scope.go:117] "RemoveContainer" containerID="484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.701962 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702046 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702148 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-config\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702182 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702281 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702358 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjgb6\" (UniqueName: \"kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6\") pod \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\" (UID: \"cf72a7f8-1033-45bc-9d5d-84473b29d28f\") " Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702632 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702767 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhmj\" (UniqueName: \"kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702838 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.702879 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.703348 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.703396 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " 
pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.704925 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.708370 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.709089 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.709473 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.709768 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.710223 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.710832 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6" (OuterVolumeSpecName: "kube-api-access-xjgb6") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "kube-api-access-xjgb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.729860 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhmj\" (UniqueName: \"kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj\") pod \"dnsmasq-dns-5dc4fcdbc-ljm24\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.756950 5127 scope.go:117] "RemoveContainer" containerID="9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa" Feb 01 07:08:33 crc kubenswrapper[5127]: E0201 07:08:33.758006 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa\": container with ID starting with 9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa not found: ID does not exist" containerID="9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.758040 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa"} err="failed to get container status \"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa\": rpc error: code = NotFound desc = could not find container \"9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa\": container with ID starting with 9fea742755b5b465f632b96c2ace0673a18c1c0c3cfb07fa6293368943ca37fa not found: ID does not exist" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.758672 5127 scope.go:117] "RemoveContainer" containerID="484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07" Feb 01 07:08:33 crc kubenswrapper[5127]: E0201 07:08:33.759009 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07\": container with ID starting with 484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07 not found: ID does not exist" containerID="484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.759041 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07"} err="failed to get container status \"484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07\": rpc error: code = NotFound desc = could not find container \"484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07\": container with ID starting with 484e048ad936aaf79e54efa1f091e5d05371cadc219221e7439eb8892f523d07 not found: ID does not exist" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.762905 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.773850 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.773952 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-config" (OuterVolumeSpecName: "config") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.779086 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.796856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf72a7f8-1033-45bc-9d5d-84473b29d28f" (UID: "cf72a7f8-1033-45bc-9d5d-84473b29d28f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804517 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjgb6\" (UniqueName: \"kubernetes.io/projected/cf72a7f8-1033-45bc-9d5d-84473b29d28f-kube-api-access-xjgb6\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804782 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804791 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804803 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804814 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.804822 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf72a7f8-1033-45bc-9d5d-84473b29d28f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.872193 5127 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:33 crc kubenswrapper[5127]: E0201 07:08:33.872519 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="init" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.872534 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="init" Feb 01 07:08:33 crc kubenswrapper[5127]: E0201 07:08:33.872544 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="dnsmasq-dns" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.872551 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="dnsmasq-dns" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.872711 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" containerName="dnsmasq-dns" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.873543 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.877023 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.877366 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.877605 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-lkmxs" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.892047 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.900837 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:33 crc kubenswrapper[5127]: W0201 07:08:33.946052 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e8257dc_a94e_4c4a_987f_8329c2201f78.slice/crio-d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833 WatchSource:0}: Error finding container d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833: Status 404 returned error can't find the container with id d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833 Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.947100 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.958518 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xxjcv"] Feb 01 07:08:33 crc kubenswrapper[5127]: I0201 07:08:33.980186 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013422 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013491 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013522 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9v6k\" (UniqueName: \"kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013555 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013603 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013675 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.013706 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.014651 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-sthmg"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.034824 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.035148 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:34 crc kubenswrapper[5127]: E0201 07:08:34.035464 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626d256d-3b31-4da0-9adf-6b1c11e8330d" containerName="init" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.035475 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="626d256d-3b31-4da0-9adf-6b1c11e8330d" containerName="init" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.035742 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="626d256d-3b31-4da0-9adf-6b1c11e8330d" containerName="init" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.037863 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.039649 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.053672 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.102804 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z9xf9"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.114836 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.114911 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.114965 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115043 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5zjb\" (UniqueName: 
\"kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115118 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115189 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb\") pod \"626d256d-3b31-4da0-9adf-6b1c11e8330d\" (UID: \"626d256d-3b31-4da0-9adf-6b1c11e8330d\") " Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115476 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9v6k\" (UniqueName: \"kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115510 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115556 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115593 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115610 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115659 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115677 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115713 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115747 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115771 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vslsx\" (UniqueName: \"kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115789 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115829 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115858 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.115882 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.118574 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.120182 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.120405 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.127436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.127909 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.129029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.129784 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ssck9"] Feb 01 07:08:34 crc kubenswrapper[5127]: W0201 07:08:34.133053 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d62e96f_7e79_4c05_8c2e_2656ef444f4a.slice/crio-a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71 WatchSource:0}: Error finding container a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71: Status 404 returned error can't find the container with id a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71 Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.133151 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb" (OuterVolumeSpecName: "kube-api-access-f5zjb") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "kube-api-access-f5zjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.137729 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-mxrjb"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.177505 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9v6k\" (UniqueName: \"kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.186649 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217380 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217452 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217472 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217516 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217537 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217612 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vslsx\" (UniqueName: \"kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217652 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.217649 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.219805 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.220213 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.220284 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5zjb\" (UniqueName: \"kubernetes.io/projected/626d256d-3b31-4da0-9adf-6b1c11e8330d-kube-api-access-f5zjb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.220302 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.227700 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.227745 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.240307 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.242864 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vslsx\" (UniqueName: \"kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.247152 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="cf72a7f8-1033-45bc-9d5d-84473b29d28f" path="/var/lib/kubelet/pods/cf72a7f8-1033-45bc-9d5d-84473b29d28f/volumes" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.293782 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.297726 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.323392 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.380615 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.403273 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config" (OuterVolumeSpecName: "config") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.403307 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.413501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "626d256d-3b31-4da0-9adf-6b1c11e8330d" (UID: "626d256d-3b31-4da0-9adf-6b1c11e8330d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.424925 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.424964 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.424975 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/626d256d-3b31-4da0-9adf-6b1c11e8330d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:34 crc kubenswrapper[5127]: W0201 07:08:34.497829 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf80e7fbd_d4ce_4c3b_9869_b0c0ebd955fb.slice/crio-29eb6a8736a554e78d4c3df0e0597260b6b9057bcde17336fc80817881f9889c WatchSource:0}: Error finding container 29eb6a8736a554e78d4c3df0e0597260b6b9057bcde17336fc80817881f9889c: Status 404 returned error can't find the container with id 29eb6a8736a554e78d4c3df0e0597260b6b9057bcde17336fc80817881f9889c Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.519636 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.626998 5127 generic.go:334] "Generic (PLEG): container finished" podID="541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" containerID="c3b12d7caca4297d8228c5b7cf055ca56c7e86e5586a04dabab353a4b75a9ad2" exitCode=0 Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.633965 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xxjcv" podStartSLOduration=2.633944999 podStartE2EDuration="2.633944999s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:34.623426234 +0000 UTC m=+1265.109328607" watchObservedRunningTime="2026-02-01 07:08:34.633944999 +0000 UTC m=+1265.119847362" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.646842 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.667904 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mxrjb" event={"ID":"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8","Type":"ContainerStarted","Data":"be8b2fafef62394ca286e80b2b10a05594b68160bea3ed7da88db6a3a238b33c"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-w7586" event={"ID":"4f4d5a37-3a02-493f-9cf9-d53931c2a92b","Type":"ContainerStarted","Data":"53f109a81663f0aab4b3b2f197ee158e05be36f0618b86015ac8d29afd5998fe"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693902 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693922 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxjcv" event={"ID":"3e8257dc-a94e-4c4a-987f-8329c2201f78","Type":"ContainerStarted","Data":"4daa23be369e6bde540ee4a93b4b832a252438b593a4db6bfafcc6fee04ab689"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693945 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-w7586"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693962 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.693993 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxjcv" event={"ID":"3e8257dc-a94e-4c4a-987f-8329c2201f78","Type":"ContainerStarted","Data":"d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694008 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9xf9" event={"ID":"2d62e96f-7e79-4c05-8c2e-2656ef444f4a","Type":"ContainerStarted","Data":"a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694024 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" event={"ID":"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb","Type":"ContainerStarted","Data":"29eb6a8736a554e78d4c3df0e0597260b6b9057bcde17336fc80817881f9889c"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694045 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" event={"ID":"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee","Type":"ContainerDied","Data":"c3b12d7caca4297d8228c5b7cf055ca56c7e86e5586a04dabab353a4b75a9ad2"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" event={"ID":"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee","Type":"ContainerStarted","Data":"660d765c7ded16fee5689802ad049618f3a92a85edd5aa8c409286fc6aa9996f"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694072 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerStarted","Data":"3757d0918db759aa076fcc4f438945e528d5edbfbb3a417cba097b29b486f0d1"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694084 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssck9" 
event={"ID":"e206ee74-3e4a-48d2-b7d1-af07cd542f72","Type":"ContainerStarted","Data":"71dd241b4f4df3b35418475820bdfcea918bc9f27f6779114224dec8769f23db"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56c9bc6f5c-45g9w" event={"ID":"626d256d-3b31-4da0-9adf-6b1c11e8330d","Type":"ContainerDied","Data":"dfd1b5ed2296fa9f199abbcff96abbd154ffe17ce80385a0145e1ae5adf01d27"} Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.694120 5127 scope.go:117] "RemoveContainer" containerID="dd161ee891128b5db5505282c5d987d5b4738d7452fa81f4186041b020cc6cfd" Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.785738 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:34 crc kubenswrapper[5127]: I0201 07:08:34.798254 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56c9bc6f5c-45g9w"] Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.009410 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.107768 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:35 crc kubenswrapper[5127]: W0201 07:08:35.116991 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff74f154_d1ce_44a2_9575_392f0fdf8862.slice/crio-4c9e31cbb2b9744c2e706df154089f121e43fab10ce8b11e75514ec9331a1041 WatchSource:0}: Error finding container 4c9e31cbb2b9744c2e706df154089f121e43fab10ce8b11e75514ec9331a1041: Status 404 returned error can't find the container with id 4c9e31cbb2b9744c2e706df154089f121e43fab10ce8b11e75514ec9331a1041 Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.146889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config\") pod \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.146966 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kgl5\" (UniqueName: \"kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5\") pod \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.147015 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc\") pod \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.147038 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0\") pod \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.147127 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb\") pod 
\"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.147195 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb\") pod \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\" (UID: \"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee\") " Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.158376 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5" (OuterVolumeSpecName: "kube-api-access-5kgl5") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "kube-api-access-5kgl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.184010 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.184620 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.191174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.192888 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config" (OuterVolumeSpecName: "config") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.194993 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" (UID: "541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251095 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kgl5\" (UniqueName: \"kubernetes.io/projected/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-kube-api-access-5kgl5\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251124 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251136 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251147 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251157 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.251169 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.332594 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:35 crc kubenswrapper[5127]: W0201 07:08:35.342431 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode10b34bf_efdd_4cf1_b9c7_a46d199924df.slice/crio-2bb03abdf469a61481ad9dc4d1b38a50baf2084f1478d96e72eba0e24e062e0d WatchSource:0}: Error finding container 2bb03abdf469a61481ad9dc4d1b38a50baf2084f1478d96e72eba0e24e062e0d: Status 404 returned error can't find the container with id 2bb03abdf469a61481ad9dc4d1b38a50baf2084f1478d96e72eba0e24e062e0d Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.678150 5127 generic.go:334] "Generic (PLEG): container finished" podID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerID="c9d8727af711b74e55d846dc4fd82093551dc35a13c0d59410e7ab90746e2061" exitCode=0 Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.678215 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" event={"ID":"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb","Type":"ContainerDied","Data":"c9d8727af711b74e55d846dc4fd82093551dc35a13c0d59410e7ab90746e2061"} Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.692189 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" event={"ID":"541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee","Type":"ContainerDied","Data":"660d765c7ded16fee5689802ad049618f3a92a85edd5aa8c409286fc6aa9996f"} Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.692436 5127 scope.go:117] "RemoveContainer" containerID="c3b12d7caca4297d8228c5b7cf055ca56c7e86e5586a04dabab353a4b75a9ad2" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.692513 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b4bb76d5-n86ld" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.738988 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerStarted","Data":"2bb03abdf469a61481ad9dc4d1b38a50baf2084f1478d96e72eba0e24e062e0d"} Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.755717 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9xf9" event={"ID":"2d62e96f-7e79-4c05-8c2e-2656ef444f4a","Type":"ContainerStarted","Data":"1748afa06e5723369ad46d3a5f7b64ed4bb8c6fb93005478eccce92fe1125025"} Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.767437 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerStarted","Data":"4c9e31cbb2b9744c2e706df154089f121e43fab10ce8b11e75514ec9331a1041"} Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.815650 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.816651 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z9xf9" podStartSLOduration=3.8166337930000003 podStartE2EDuration="3.816633793s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:35.778557573 +0000 UTC m=+1266.264459936" watchObservedRunningTime="2026-02-01 07:08:35.816633793 +0000 UTC m=+1266.302536156" Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.868414 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.919708 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.984242 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b4bb76d5-n86ld"] Feb 01 07:08:35 crc kubenswrapper[5127]: I0201 07:08:35.996602 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.252099 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" path="/var/lib/kubelet/pods/541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee/volumes" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.252986 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626d256d-3b31-4da0-9adf-6b1c11e8330d" path="/var/lib/kubelet/pods/626d256d-3b31-4da0-9adf-6b1c11e8330d/volumes" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.740812 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.740869 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.740918 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.741673 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.741726 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5" gracePeriod=600 Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.785695 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" event={"ID":"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb","Type":"ContainerStarted","Data":"036f77806ebb36bb54244e2f4b5da15343f6663f39a060b67eef97821d693e78"} Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.785782 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.791029 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerStarted","Data":"d22d7dcae43c1982685d1f667ce4e3d6b388196623bf941c1050d0584b39eb06"} Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.793684 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerStarted","Data":"a8bcfe304ecee36a0b73c160da1b64f7ef1ece4a6022e68a77ff55538af3bfa5"} Feb 01 07:08:36 crc kubenswrapper[5127]: I0201 07:08:36.810856 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" podStartSLOduration=3.810837886 podStartE2EDuration="3.810837886s" podCreationTimestamp="2026-02-01 07:08:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:36.803689563 +0000 UTC m=+1267.289591986" watchObservedRunningTime="2026-02-01 07:08:36.810837886 +0000 UTC m=+1267.296740249" Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.809643 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerStarted","Data":"b0c943bb980d204f8bfbd69e0679a276d6943e69e61f3da29b1ddad200281be7"} Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.810379 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-log" containerID="cri-o://d22d7dcae43c1982685d1f667ce4e3d6b388196623bf941c1050d0584b39eb06" gracePeriod=30 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.810850 
5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-httpd" containerID="cri-o://b0c943bb980d204f8bfbd69e0679a276d6943e69e61f3da29b1ddad200281be7" gracePeriod=30 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.817074 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5" exitCode=0 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.817142 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5"} Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.817169 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6"} Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.817188 5127 scope.go:117] "RemoveContainer" containerID="fea23606e9e9fd1c229db27d18cd60b7a13de794804404b3c4e12726e4ef14d3" Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.824309 5127 generic.go:334] "Generic (PLEG): container finished" podID="3e8257dc-a94e-4c4a-987f-8329c2201f78" containerID="4daa23be369e6bde540ee4a93b4b832a252438b593a4db6bfafcc6fee04ab689" exitCode=0 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.824381 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxjcv" event={"ID":"3e8257dc-a94e-4c4a-987f-8329c2201f78","Type":"ContainerDied","Data":"4daa23be369e6bde540ee4a93b4b832a252438b593a4db6bfafcc6fee04ab689"} Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.830464 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerStarted","Data":"97afb5c5c560a858744a3043431cebb7c1cb7b58970ec303e25790e5622cf74f"} Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.830530 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-log" containerID="cri-o://a8bcfe304ecee36a0b73c160da1b64f7ef1ece4a6022e68a77ff55538af3bfa5" gracePeriod=30 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.830653 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-httpd" containerID="cri-o://97afb5c5c560a858744a3043431cebb7c1cb7b58970ec303e25790e5622cf74f" gracePeriod=30 Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.852009 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.85198428 podStartE2EDuration="5.85198428s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:37.844370984 +0000 UTC m=+1268.330273357" watchObservedRunningTime="2026-02-01 07:08:37.85198428 +0000 UTC 
m=+1268.337886643" Feb 01 07:08:37 crc kubenswrapper[5127]: I0201 07:08:37.892167 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.892146127 podStartE2EDuration="5.892146127s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:37.884929602 +0000 UTC m=+1268.370831985" watchObservedRunningTime="2026-02-01 07:08:37.892146127 +0000 UTC m=+1268.378048490" Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.843068 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerID="b0c943bb980d204f8bfbd69e0679a276d6943e69e61f3da29b1ddad200281be7" exitCode=0 Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.843414 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerID="d22d7dcae43c1982685d1f667ce4e3d6b388196623bf941c1050d0584b39eb06" exitCode=143 Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.843130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerDied","Data":"b0c943bb980d204f8bfbd69e0679a276d6943e69e61f3da29b1ddad200281be7"} Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.843478 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerDied","Data":"d22d7dcae43c1982685d1f667ce4e3d6b388196623bf941c1050d0584b39eb06"} Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.847691 5127 generic.go:334] "Generic (PLEG): container finished" podID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerID="97afb5c5c560a858744a3043431cebb7c1cb7b58970ec303e25790e5622cf74f" exitCode=0 Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.847711 5127 generic.go:334] "Generic (PLEG): container finished" podID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerID="a8bcfe304ecee36a0b73c160da1b64f7ef1ece4a6022e68a77ff55538af3bfa5" exitCode=143 Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.849104 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerDied","Data":"97afb5c5c560a858744a3043431cebb7c1cb7b58970ec303e25790e5622cf74f"} Feb 01 07:08:38 crc kubenswrapper[5127]: I0201 07:08:38.849180 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerDied","Data":"a8bcfe304ecee36a0b73c160da1b64f7ef1ece4a6022e68a77ff55538af3bfa5"} Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.378054 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562612 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562721 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562755 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562835 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9v6k\" (UniqueName: \"kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562884 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562935 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.562964 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts\") pod \"ff74f154-d1ce-44a2-9575-392f0fdf8862\" (UID: \"ff74f154-d1ce-44a2-9575-392f0fdf8862\") " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.563270 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.563682 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.564533 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs" (OuterVolumeSpecName: "logs") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.570247 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.571070 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts" (OuterVolumeSpecName: "scripts") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.572841 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k" (OuterVolumeSpecName: "kube-api-access-n9v6k") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "kube-api-access-n9v6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.599376 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.619268 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data" (OuterVolumeSpecName: "config-data") pod "ff74f154-d1ce-44a2-9575-392f0fdf8862" (UID: "ff74f154-d1ce-44a2-9575-392f0fdf8862"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665405 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665442 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665454 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665466 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff74f154-d1ce-44a2-9575-392f0fdf8862-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665478 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9v6k\" (UniqueName: \"kubernetes.io/projected/ff74f154-d1ce-44a2-9575-392f0fdf8862-kube-api-access-n9v6k\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.665491 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff74f154-d1ce-44a2-9575-392f0fdf8862-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.690782 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.767021 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.892482 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff74f154-d1ce-44a2-9575-392f0fdf8862","Type":"ContainerDied","Data":"4c9e31cbb2b9744c2e706df154089f121e43fab10ce8b11e75514ec9331a1041"} Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.892564 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.900696 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.965905 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:43 crc kubenswrapper[5127]: I0201 07:08:43.974445 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.003253 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"] Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.003790 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns" containerID="cri-o://b9ff0e88c0e37b2e9fadda6a2bad473d2a35e03ce64018a8297df33d9e81fbbe" gracePeriod=10 Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.010912 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:44 crc kubenswrapper[5127]: E0201 07:08:44.011347 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-log" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011366 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-log" Feb 01 07:08:44 crc kubenswrapper[5127]: E0201 07:08:44.011398 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" containerName="init" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011404 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" containerName="init" Feb 01 07:08:44 crc kubenswrapper[5127]: E0201 07:08:44.011418 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-httpd" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011424 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-httpd" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011571 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-httpd" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011597 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="541e2ff2-02cc-4f9c-aa18-b6fe2fbe70ee" containerName="init" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.011624 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" containerName="glance-log" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.012434 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.018167 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.018690 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.037062 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202455 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202503 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202548 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcwds\" (UniqueName: \"kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202605 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202654 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202714 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202739 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.202883 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.247140 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff74f154-d1ce-44a2-9575-392f0fdf8862" path="/var/lib/kubelet/pods/ff74f154-d1ce-44a2-9575-392f0fdf8862/volumes" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304542 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304590 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304687 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcwds\" (UniqueName: \"kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304723 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304740 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304767 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304786 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304819 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs\") pod \"glance-default-external-api-0\" (UID: 
\"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.304916 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.305278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.305275 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.311065 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.311659 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.334875 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.335467 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.341536 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.355560 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcwds\" (UniqueName: \"kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds\") pod \"glance-default-external-api-0\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 
07:08:44.641935 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.910165 5127 generic.go:334] "Generic (PLEG): container finished" podID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerID="b9ff0e88c0e37b2e9fadda6a2bad473d2a35e03ce64018a8297df33d9e81fbbe" exitCode=0 Feb 01 07:08:44 crc kubenswrapper[5127]: I0201 07:08:44.910223 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" event={"ID":"e3ad894b-32c0-4283-839b-e29bf71b1381","Type":"ContainerDied","Data":"b9ff0e88c0e37b2e9fadda6a2bad473d2a35e03ce64018a8297df33d9e81fbbe"} Feb 01 07:08:46 crc kubenswrapper[5127]: I0201 07:08:46.294763 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 01 07:08:47 crc kubenswrapper[5127]: E0201 07:08:47.394333 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Feb 01 07:08:47 crc kubenswrapper[5127]: E0201 07:08:47.394463 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zx6jc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-mxrjb_openstack(7692c2d1-b96e-4d2d-b0b8-039a5125c9b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:08:47 crc kubenswrapper[5127]: E0201 07:08:47.395649 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-mxrjb" podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.505701 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.669345 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.669685 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.669722 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l8qd\" (UniqueName: \"kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.669866 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.669992 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.670052 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys\") pod \"3e8257dc-a94e-4c4a-987f-8329c2201f78\" (UID: \"3e8257dc-a94e-4c4a-987f-8329c2201f78\") " Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.678080 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd" (OuterVolumeSpecName: "kube-api-access-9l8qd") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "kube-api-access-9l8qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.683568 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.695794 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts" (OuterVolumeSpecName: "scripts") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.695904 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.726334 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.729567 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data" (OuterVolumeSpecName: "config-data") pod "3e8257dc-a94e-4c4a-987f-8329c2201f78" (UID: "3e8257dc-a94e-4c4a-987f-8329c2201f78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772013 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772040 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772049 5127 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772058 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772067 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l8qd\" (UniqueName: \"kubernetes.io/projected/3e8257dc-a94e-4c4a-987f-8329c2201f78-kube-api-access-9l8qd\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.772074 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e8257dc-a94e-4c4a-987f-8329c2201f78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.941142 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxjcv" 
event={"ID":"3e8257dc-a94e-4c4a-987f-8329c2201f78","Type":"ContainerDied","Data":"d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833"} Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.941182 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxjcv" Feb 01 07:08:47 crc kubenswrapper[5127]: I0201 07:08:47.941186 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d232e26b2ced06affc9bddb2352385b10042bca1db65a4a7067795ab53c01833" Feb 01 07:08:47 crc kubenswrapper[5127]: E0201 07:08:47.962926 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-mxrjb" podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.599543 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xxjcv"] Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.607881 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xxjcv"] Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.694989 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-z8dx7"] Feb 01 07:08:48 crc kubenswrapper[5127]: E0201 07:08:48.695407 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e8257dc-a94e-4c4a-987f-8329c2201f78" containerName="keystone-bootstrap" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.695433 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e8257dc-a94e-4c4a-987f-8329c2201f78" containerName="keystone-bootstrap" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.695655 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e8257dc-a94e-4c4a-987f-8329c2201f78" containerName="keystone-bootstrap" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.696233 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.698772 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.699051 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.699273 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7vjtc" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.699316 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.699415 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.711476 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z8dx7"] Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.794581 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.794934 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.794993 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.795014 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbqm\" (UniqueName: \"kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.795207 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.795399 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.896810 5127 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.896861 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.896915 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.896949 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbqm\" (UniqueName: \"kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.896998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.897062 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.903410 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.913368 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbqm\" (UniqueName: \"kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.917320 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.917326 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts\") pod \"keystone-bootstrap-z8dx7\" (UID: 
\"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.920042 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:48 crc kubenswrapper[5127]: I0201 07:08:48.923413 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys\") pod \"keystone-bootstrap-z8dx7\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") " pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:49 crc kubenswrapper[5127]: I0201 07:08:49.020822 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8dx7" Feb 01 07:08:50 crc kubenswrapper[5127]: I0201 07:08:50.246307 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e8257dc-a94e-4c4a-987f-8329c2201f78" path="/var/lib/kubelet/pods/3e8257dc-a94e-4c4a-987f-8329c2201f78/volumes" Feb 01 07:08:51 crc kubenswrapper[5127]: I0201 07:08:51.294368 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.112186 5127 scope.go:117] "RemoveContainer" containerID="b0c943bb980d204f8bfbd69e0679a276d6943e69e61f3da29b1ddad200281be7" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.165738 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238332 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238374 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238420 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vslsx\" (UniqueName: \"kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238436 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238540 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238574 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.238630 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run\") pod \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\" (UID: \"e10b34bf-efdd-4cf1-b9c7-a46d199924df\") " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.239235 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.239530 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs" (OuterVolumeSpecName: "logs") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.251276 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts" (OuterVolumeSpecName: "scripts") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.253903 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx" (OuterVolumeSpecName: "kube-api-access-vslsx") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "kube-api-access-vslsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.260076 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.268773 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.302319 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data" (OuterVolumeSpecName: "config-data") pod "e10b34bf-efdd-4cf1-b9c7-a46d199924df" (UID: "e10b34bf-efdd-4cf1-b9c7-a46d199924df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340810 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340844 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340867 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340882 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vslsx\" (UniqueName: \"kubernetes.io/projected/e10b34bf-efdd-4cf1-b9c7-a46d199924df-kube-api-access-vslsx\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340894 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340907 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e10b34bf-efdd-4cf1-b9c7-a46d199924df-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.340917 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e10b34bf-efdd-4cf1-b9c7-a46d199924df-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.362425 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 01 07:08:55 crc kubenswrapper[5127]: I0201 07:08:55.442758 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.046516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e10b34bf-efdd-4cf1-b9c7-a46d199924df","Type":"ContainerDied","Data":"2bb03abdf469a61481ad9dc4d1b38a50baf2084f1478d96e72eba0e24e062e0d"} Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.046662 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.089110 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.099330 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.121559 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:56 crc kubenswrapper[5127]: E0201 07:08:56.124268 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-log" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.124369 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-log" Feb 01 07:08:56 crc kubenswrapper[5127]: E0201 07:08:56.124524 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-httpd" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.124660 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-httpd" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.124934 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-log" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.126276 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" containerName="glance-httpd" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.127628 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.130438 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.130646 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.137553 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.245945 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e10b34bf-efdd-4cf1-b9c7-a46d199924df" path="/var/lib/kubelet/pods/e10b34bf-efdd-4cf1-b9c7-a46d199924df/volumes" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256457 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256505 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256614 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256663 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6hq4\" (UniqueName: \"kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256710 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256738 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256761 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.256819 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359103 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359201 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6hq4\" (UniqueName: \"kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359240 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359265 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359286 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359326 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359491 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.359516 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " 
pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.360008 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.360192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.361129 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.367080 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.367266 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.369031 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.372524 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.388827 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6hq4\" (UniqueName: \"kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.389253 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:08:56 crc kubenswrapper[5127]: I0201 07:08:56.452535 5127 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:08:57 crc kubenswrapper[5127]: E0201 07:08:57.243415 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Feb 01 07:08:57 crc kubenswrapper[5127]: E0201 07:08:57.244156 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sjg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-w7586_openstack(4f4d5a37-3a02-493f-9cf9-d53931c2a92b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 07:08:57 crc kubenswrapper[5127]: E0201 07:08:57.245742 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-w7586" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" Feb 01 07:08:57 crc 
kubenswrapper[5127]: I0201 07:08:57.252606 5127 scope.go:117] "RemoveContainer" containerID="d22d7dcae43c1982685d1f667ce4e3d6b388196623bf941c1050d0584b39eb06" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.463201 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.465489 5127 scope.go:117] "RemoveContainer" containerID="97afb5c5c560a858744a3043431cebb7c1cb7b58970ec303e25790e5622cf74f" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.525905 5127 scope.go:117] "RemoveContainer" containerID="a8bcfe304ecee36a0b73c160da1b64f7ef1ece4a6022e68a77ff55538af3bfa5" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.590097 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config\") pod \"e3ad894b-32c0-4283-839b-e29bf71b1381\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.590159 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb\") pod \"e3ad894b-32c0-4283-839b-e29bf71b1381\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.590199 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc\") pod \"e3ad894b-32c0-4283-839b-e29bf71b1381\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.590304 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn\") pod \"e3ad894b-32c0-4283-839b-e29bf71b1381\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.590353 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb\") pod \"e3ad894b-32c0-4283-839b-e29bf71b1381\" (UID: \"e3ad894b-32c0-4283-839b-e29bf71b1381\") " Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.598352 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn" (OuterVolumeSpecName: "kube-api-access-4h2zn") pod "e3ad894b-32c0-4283-839b-e29bf71b1381" (UID: "e3ad894b-32c0-4283-839b-e29bf71b1381"). InnerVolumeSpecName "kube-api-access-4h2zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.652799 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3ad894b-32c0-4283-839b-e29bf71b1381" (UID: "e3ad894b-32c0-4283-839b-e29bf71b1381"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.653361 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3ad894b-32c0-4283-839b-e29bf71b1381" (UID: "e3ad894b-32c0-4283-839b-e29bf71b1381"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.661920 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config" (OuterVolumeSpecName: "config") pod "e3ad894b-32c0-4283-839b-e29bf71b1381" (UID: "e3ad894b-32c0-4283-839b-e29bf71b1381"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.677910 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3ad894b-32c0-4283-839b-e29bf71b1381" (UID: "e3ad894b-32c0-4283-839b-e29bf71b1381"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.692499 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.692543 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.692556 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.692570 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h2zn\" (UniqueName: \"kubernetes.io/projected/e3ad894b-32c0-4283-839b-e29bf71b1381-kube-api-access-4h2zn\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.692600 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3ad894b-32c0-4283-839b-e29bf71b1381-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.806764 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:08:57 crc kubenswrapper[5127]: W0201 07:08:57.809039 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4198b002_5dbd_4855_97f1_bae36fd86bf5.slice/crio-1ee9603ae51259dfae420ca57c3ed1fd060d26a029054ae3aac803505eda39b4 WatchSource:0}: Error finding container 1ee9603ae51259dfae420ca57c3ed1fd060d26a029054ae3aac803505eda39b4: Status 404 returned error can't find the container with id 1ee9603ae51259dfae420ca57c3ed1fd060d26a029054ae3aac803505eda39b4 Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.832150 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z8dx7"] Feb 01 07:08:57 crc 
kubenswrapper[5127]: W0201 07:08:57.840088 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5d487e_8c6a_431b_b720_4b242eec1c40.slice/crio-76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08 WatchSource:0}: Error finding container 76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08: Status 404 returned error can't find the container with id 76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08
Feb 01 07:08:57 crc kubenswrapper[5127]: I0201 07:08:57.978321 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 01 07:08:57 crc kubenswrapper[5127]: W0201 07:08:57.990686 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod541316fd_1125_4922_8791_4c40a7188768.slice/crio-bdd701ae3852e570566e8b7bb18ddcc50af81bbce0d6dc305a27bfdc29d892ad WatchSource:0}: Error finding container bdd701ae3852e570566e8b7bb18ddcc50af81bbce0d6dc305a27bfdc29d892ad: Status 404 returned error can't find the container with id bdd701ae3852e570566e8b7bb18ddcc50af81bbce0d6dc305a27bfdc29d892ad
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.068859 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerStarted","Data":"bdd701ae3852e570566e8b7bb18ddcc50af81bbce0d6dc305a27bfdc29d892ad"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.072233 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8dx7" event={"ID":"ee5d487e-8c6a-431b-b720-4b242eec1c40","Type":"ContainerStarted","Data":"3488f73206b087a954bbef94f7cd739bdad5a3478cc5450889c6ba48c10a6d60"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.072266 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8dx7" event={"ID":"ee5d487e-8c6a-431b-b720-4b242eec1c40","Type":"ContainerStarted","Data":"76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.076548 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerStarted","Data":"2fcf09cf738ee71b3db6533f0029a17da44d8dbcd16a1db45a01f5156188fe21"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.084691 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssck9" event={"ID":"e206ee74-3e4a-48d2-b7d1-af07cd542f72","Type":"ContainerStarted","Data":"2dd5e354f57ec00475cc69e5f8c37183bd10c5dcd5115dcaf503de25d881f18b"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.088898 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerStarted","Data":"1ee9603ae51259dfae420ca57c3ed1fd060d26a029054ae3aac803505eda39b4"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.094417 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-z8dx7" podStartSLOduration=10.094349067 podStartE2EDuration="10.094349067s" podCreationTimestamp="2026-02-01 07:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:08:58.093731799 +0000 UTC m=+1288.579634182" watchObservedRunningTime="2026-02-01 07:08:58.094349067 +0000 UTC m=+1288.580251430"
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.100351 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" event={"ID":"e3ad894b-32c0-4283-839b-e29bf71b1381","Type":"ContainerDied","Data":"f27c8b04c719e63b9c184fe5e268eb2b7a87188a0f82e75c6511c49d2d6fe91f"}
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.100381 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl"
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.100396 5127 scope.go:117] "RemoveContainer" containerID="b9ff0e88c0e37b2e9fadda6a2bad473d2a35e03ce64018a8297df33d9e81fbbe"
Feb 01 07:08:58 crc kubenswrapper[5127]: E0201 07:08:58.101939 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-w7586" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b"
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.114210 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ssck9" podStartSLOduration=2.116190109 podStartE2EDuration="25.114187204s" podCreationTimestamp="2026-02-01 07:08:33 +0000 UTC" firstStartedPulling="2026-02-01 07:08:34.173348555 +0000 UTC m=+1264.659250918" lastFinishedPulling="2026-02-01 07:08:57.17134562 +0000 UTC m=+1287.657248013" observedRunningTime="2026-02-01 07:08:58.110735479 +0000 UTC m=+1288.596637842" watchObservedRunningTime="2026-02-01 07:08:58.114187204 +0000 UTC m=+1288.600089577"
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.160139 5127 scope.go:117] "RemoveContainer" containerID="7a0917b4ceb20433e417aa83fd15f81d56a2a23d54172cd546aa05dc4a139d45"
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.181121 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"]
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.187374 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-l66pl"]
Feb 01 07:08:58 crc kubenswrapper[5127]: I0201 07:08:58.256859 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" path="/var/lib/kubelet/pods/e3ad894b-32c0-4283-839b-e29bf71b1381/volumes"
Feb 01 07:08:59 crc kubenswrapper[5127]: I0201 07:08:59.118940 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerStarted","Data":"28ed629b545fff6fa4fe5b198e94d2adaf4720408ded6f5999779c65743ee58c"}
Feb 01 07:08:59 crc kubenswrapper[5127]: I0201 07:08:59.130081 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerStarted","Data":"de803807ce8b3fdcbb5bdcaab8ac82b578e091733368f0cadd7095da580a6a90"}
Feb 01 07:08:59 crc kubenswrapper[5127]: I0201 07:08:59.133748 5127 generic.go:334] "Generic (PLEG): container finished" podID="2d62e96f-7e79-4c05-8c2e-2656ef444f4a" containerID="1748afa06e5723369ad46d3a5f7b64ed4bb8c6fb93005478eccce92fe1125025" exitCode=0
Feb 01 07:08:59 crc kubenswrapper[5127]: I0201 07:08:59.133855 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9xf9" event={"ID":"2d62e96f-7e79-4c05-8c2e-2656ef444f4a","Type":"ContainerDied","Data":"1748afa06e5723369ad46d3a5f7b64ed4bb8c6fb93005478eccce92fe1125025"}
Feb 01 07:08:59 crc kubenswrapper[5127]: I0201 07:08:59.137940 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerStarted","Data":"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459"}
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.150278 5127 generic.go:334] "Generic (PLEG): container finished" podID="e206ee74-3e4a-48d2-b7d1-af07cd542f72" containerID="2dd5e354f57ec00475cc69e5f8c37183bd10c5dcd5115dcaf503de25d881f18b" exitCode=0
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.150378 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssck9" event={"ID":"e206ee74-3e4a-48d2-b7d1-af07cd542f72","Type":"ContainerDied","Data":"2dd5e354f57ec00475cc69e5f8c37183bd10c5dcd5115dcaf503de25d881f18b"}
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.161608 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerStarted","Data":"e429fa501c99bbff52997ad8cb9d22463911edc7607c645c63b1029454eb35b5"}
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.164676 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerStarted","Data":"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078"}
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.174237 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mxrjb" event={"ID":"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8","Type":"ContainerStarted","Data":"12815831357c053ba52d5cd1f3f3dd72afee59a298bcd5c16a3985ed369f9e18"}
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.244120 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.24409819 podStartE2EDuration="4.24409819s" podCreationTimestamp="2026-02-01 07:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:00.204866028 +0000 UTC m=+1290.690768401" watchObservedRunningTime="2026-02-01 07:09:00.24409819 +0000 UTC m=+1290.730000543"
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.244649 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.244642864 podStartE2EDuration="17.244642864s" podCreationTimestamp="2026-02-01 07:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:00.230185353 +0000 UTC m=+1290.716087716" watchObservedRunningTime="2026-02-01 07:09:00.244642864 +0000 UTC m=+1290.730545227"
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.254700 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-mxrjb" podStartSLOduration=1.700662585 podStartE2EDuration="27.254678416s" podCreationTimestamp="2026-02-01 07:08:33 +0000 UTC" firstStartedPulling="2026-02-01 07:08:34.137529246 +0000 UTC m=+1264.623431609" lastFinishedPulling="2026-02-01 07:08:59.691545077 +0000 UTC m=+1290.177447440" observedRunningTime="2026-02-01 07:09:00.254047719 +0000 UTC m=+1290.739950092" watchObservedRunningTime="2026-02-01 07:09:00.254678416 +0000 UTC m=+1290.740580779"
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.536533 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9xf9"
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.682889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g54fr\" (UniqueName: \"kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr\") pod \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") "
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.683064 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle\") pod \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") "
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.683288 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config\") pod \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\" (UID: \"2d62e96f-7e79-4c05-8c2e-2656ef444f4a\") "
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.687966 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr" (OuterVolumeSpecName: "kube-api-access-g54fr") pod "2d62e96f-7e79-4c05-8c2e-2656ef444f4a" (UID: "2d62e96f-7e79-4c05-8c2e-2656ef444f4a"). InnerVolumeSpecName "kube-api-access-g54fr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.713598 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config" (OuterVolumeSpecName: "config") pod "2d62e96f-7e79-4c05-8c2e-2656ef444f4a" (UID: "2d62e96f-7e79-4c05-8c2e-2656ef444f4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.714908 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d62e96f-7e79-4c05-8c2e-2656ef444f4a" (UID: "2d62e96f-7e79-4c05-8c2e-2656ef444f4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.785203 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.785232 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:00 crc kubenswrapper[5127]: I0201 07:09:00.785242 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g54fr\" (UniqueName: \"kubernetes.io/projected/2d62e96f-7e79-4c05-8c2e-2656ef444f4a-kube-api-access-g54fr\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.242892 5127 generic.go:334] "Generic (PLEG): container finished" podID="ee5d487e-8c6a-431b-b720-4b242eec1c40" containerID="3488f73206b087a954bbef94f7cd739bdad5a3478cc5450889c6ba48c10a6d60" exitCode=0
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.242962 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8dx7" event={"ID":"ee5d487e-8c6a-431b-b720-4b242eec1c40","Type":"ContainerDied","Data":"3488f73206b087a954bbef94f7cd739bdad5a3478cc5450889c6ba48c10a6d60"}
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.249501 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z9xf9"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.254963 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z9xf9" event={"ID":"2d62e96f-7e79-4c05-8c2e-2656ef444f4a","Type":"ContainerDied","Data":"a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71"}
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.255028 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a698a3a841868d4df8086230dfdd0e708eef55c53f09d7f8be451c0a57b04b71"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.295856 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-l66pl" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.345778 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"]
Feb 01 07:09:01 crc kubenswrapper[5127]: E0201 07:09:01.346158 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.346173 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns"
Feb 01 07:09:01 crc kubenswrapper[5127]: E0201 07:09:01.346186 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="init"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.346192 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="init"
Feb 01 07:09:01 crc kubenswrapper[5127]: E0201 07:09:01.346201 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d62e96f-7e79-4c05-8c2e-2656ef444f4a" containerName="neutron-db-sync"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.346207 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d62e96f-7e79-4c05-8c2e-2656ef444f4a" containerName="neutron-db-sync"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.346386 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3ad894b-32c0-4283-839b-e29bf71b1381" containerName="dnsmasq-dns"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.346402 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d62e96f-7e79-4c05-8c2e-2656ef444f4a" containerName="neutron-db-sync"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.347281 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.388217 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"]
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.402731 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"]
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.404407 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.418892 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2pfbp"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.419072 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.419177 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.419309 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.434536 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"]
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507721 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507797 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zck9\" (UniqueName: \"kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507832 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507873 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507915 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507952 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.507986 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d55g\" (UniqueName: \"kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.508009 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.508047 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.508081 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.508109 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.609902 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zck9\" (UniqueName: \"kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.609957 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.610005 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.610048 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.610085 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611214 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611327 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d55g\" (UniqueName: \"kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611360 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611396 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611422 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611442 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.611477 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.613925 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.614072 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.614701 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.615333 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.619469 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.620999 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.626880 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zck9\" (UniqueName: \"kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.629709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.633767 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config\") pod \"neutron-769f857fd8-mc6lf\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") " pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.634945 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d55g\" (UniqueName: \"kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g\") pod \"dnsmasq-dns-6b9c8b59c-wcxl4\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.679470 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4"
Feb 01 07:09:01 crc kubenswrapper[5127]: I0201 07:09:01.736544 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.467042 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ssck9"
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.628942 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq7sl\" (UniqueName: \"kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl\") pod \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") "
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.629388 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts\") pod \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") "
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.629422 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs\") pod \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") "
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.629494 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data\") pod \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") "
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.629539 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle\") pod \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\" (UID: \"e206ee74-3e4a-48d2-b7d1-af07cd542f72\") "
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.630281 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs" (OuterVolumeSpecName: "logs") pod "e206ee74-3e4a-48d2-b7d1-af07cd542f72" (UID: "e206ee74-3e4a-48d2-b7d1-af07cd542f72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.630394 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e206ee74-3e4a-48d2-b7d1-af07cd542f72-logs\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.636010 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts" (OuterVolumeSpecName: "scripts") pod "e206ee74-3e4a-48d2-b7d1-af07cd542f72" (UID: "e206ee74-3e4a-48d2-b7d1-af07cd542f72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.636545 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl" (OuterVolumeSpecName: "kube-api-access-mq7sl") pod "e206ee74-3e4a-48d2-b7d1-af07cd542f72" (UID: "e206ee74-3e4a-48d2-b7d1-af07cd542f72"). InnerVolumeSpecName "kube-api-access-mq7sl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.657450 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e206ee74-3e4a-48d2-b7d1-af07cd542f72" (UID: "e206ee74-3e4a-48d2-b7d1-af07cd542f72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.662194 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data" (OuterVolumeSpecName: "config-data") pod "e206ee74-3e4a-48d2-b7d1-af07cd542f72" (UID: "e206ee74-3e4a-48d2-b7d1-af07cd542f72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.732145 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.732181 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.732193 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e206ee74-3e4a-48d2-b7d1-af07cd542f72-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:02 crc kubenswrapper[5127]: I0201 07:09:02.732207 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq7sl\" (UniqueName: \"kubernetes.io/projected/e206ee74-3e4a-48d2-b7d1-af07cd542f72-kube-api-access-mq7sl\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.265142 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ssck9" event={"ID":"e206ee74-3e4a-48d2-b7d1-af07cd542f72","Type":"ContainerDied","Data":"71dd241b4f4df3b35418475820bdfcea918bc9f27f6779114224dec8769f23db"}
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.265474 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71dd241b4f4df3b35418475820bdfcea918bc9f27f6779114224dec8769f23db"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.265172 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ssck9"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.267032 5127 generic.go:334] "Generic (PLEG): container finished" podID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" containerID="12815831357c053ba52d5cd1f3f3dd72afee59a298bcd5c16a3985ed369f9e18" exitCode=0
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.267073 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mxrjb" event={"ID":"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8","Type":"ContainerDied","Data":"12815831357c053ba52d5cd1f3f3dd72afee59a298bcd5c16a3985ed369f9e18"}
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.587082 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-785b58c67b-rrzfw"]
Feb 01 07:09:03 crc kubenswrapper[5127]: E0201 07:09:03.587480 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e206ee74-3e4a-48d2-b7d1-af07cd542f72" containerName="placement-db-sync"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.587493 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e206ee74-3e4a-48d2-b7d1-af07cd542f72" containerName="placement-db-sync"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.587662 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e206ee74-3e4a-48d2-b7d1-af07cd542f72" containerName="placement-db-sync"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.588446 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.593112 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.593415 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.593593 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.593662 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.595551 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qqq8k"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.604004 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-785b58c67b-rrzfw"]
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.774818 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.774932 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzhpv\" (UniqueName: \"kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.774965 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.775005 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.775114 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.775163 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.775184 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876198 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876296 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876322 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876346 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876375 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzhpv\" (UniqueName: \"kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.876429 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.877648 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.881752 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.882164 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.883285 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.883289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.894930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.897536 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzhpv\" (UniqueName: \"kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv\") pod \"placement-785b58c67b-rrzfw\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") " pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:03 crc kubenswrapper[5127]: I0201 07:09:03.912775 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.075493 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c76548565-62sx9"]
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.079347 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.086164 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c76548565-62sx9"]
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.090911 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.091281 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199488 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txrss\" (UniqueName: \"kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199544 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199597 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199653 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199680 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199698 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.199729 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.302796 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.302855 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.302888 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.302971 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.303050 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txrss\" (UniqueName: \"kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.303089 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.303171 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.312351 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.312630 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.313406 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.315563 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.324203 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txrss\" (UniqueName: \"kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.324270 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.325152 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle\") pod \"neutron-5c76548565-62sx9\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.427236 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.645012 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.645292 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.660779 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8dx7"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.666667 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mxrjb"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.696554 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.707497 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711635 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711709 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zx6jc\" (UniqueName: \"kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc\") pod \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711809 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711829 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle\") pod \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711874 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.711915 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.718983 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqbqm\" (UniqueName: \"kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.719191 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data\") pod \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\" (UID: \"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.719265 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts\") pod \"ee5d487e-8c6a-431b-b720-4b242eec1c40\" (UID: \"ee5d487e-8c6a-431b-b720-4b242eec1c40\") "
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.721190 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.722069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc" (OuterVolumeSpecName: "kube-api-access-zx6jc") pod "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" (UID: "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8"). InnerVolumeSpecName "kube-api-access-zx6jc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.724966 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm" (OuterVolumeSpecName: "kube-api-access-vqbqm") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "kube-api-access-vqbqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.727696 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" (UID: "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.730468 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.739913 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts" (OuterVolumeSpecName: "scripts") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.766787 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data" (OuterVolumeSpecName: "config-data") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.787344 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" (UID: "7692c2d1-b96e-4d2d-b0b8-039a5125c9b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.788918 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee5d487e-8c6a-431b-b720-4b242eec1c40" (UID: "ee5d487e-8c6a-431b-b720-4b242eec1c40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822715 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822748 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822758 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zx6jc\" (UniqueName: \"kubernetes.io/projected/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-kube-api-access-zx6jc\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822769 5127 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822778 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822787 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822794 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5d487e-8c6a-431b-b720-4b242eec1c40-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822803 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqbqm\" (UniqueName: \"kubernetes.io/projected/ee5d487e-8c6a-431b-b720-4b242eec1c40-kube-api-access-vqbqm\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:04 crc kubenswrapper[5127]: I0201 07:09:04.822812 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.253190 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"]
Feb 01 07:09:05 crc kubenswrapper[5127]: W0201 07:09:05.269760 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode87ee524_fbce_45ca_b3fb_e6b59a739f73.slice/crio-56f6ff2e1bf999e0c3761e33c152589bf39514fef880cd4270f186f01001aa6b WatchSource:0}: Error finding container 56f6ff2e1bf999e0c3761e33c152589bf39514fef880cd4270f186f01001aa6b: Status 404 returned error can't find the container with id 56f6ff2e1bf999e0c3761e33c152589bf39514fef880cd4270f186f01001aa6b
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.288344 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerStarted","Data":"56f6ff2e1bf999e0c3761e33c152589bf39514fef880cd4270f186f01001aa6b"}
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.294671 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-mxrjb" event={"ID":"7692c2d1-b96e-4d2d-b0b8-039a5125c9b8","Type":"ContainerDied","Data":"be8b2fafef62394ca286e80b2b10a05594b68160bea3ed7da88db6a3a238b33c"}
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.294714 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8b2fafef62394ca286e80b2b10a05594b68160bea3ed7da88db6a3a238b33c"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.295306 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-mxrjb"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.297722 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"]
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.308661 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z8dx7"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.310271 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z8dx7" event={"ID":"ee5d487e-8c6a-431b-b720-4b242eec1c40","Type":"ContainerDied","Data":"76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08"}
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.310327 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76253a2b48c122d66edec39139a8dfe973f0bbe59a76f7841445005a5c96ac08"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.310849 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-785b58c67b-rrzfw"]
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.313507 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerStarted","Data":"1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3"}
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.313538 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.313619 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.478384 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"]
Feb 01 07:09:05 crc kubenswrapper[5127]: E0201 07:09:05.488515 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5d487e-8c6a-431b-b720-4b242eec1c40" containerName="keystone-bootstrap"
Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.493765 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5d487e-8c6a-431b-b720-4b242eec1c40" containerName="keystone-bootstrap"
Feb 01 07:09:05 crc kubenswrapper[5127]: E0201 07:09:05.494014 5127 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" containerName="barbican-db-sync" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.494091 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" containerName="barbican-db-sync" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.494514 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5d487e-8c6a-431b-b720-4b242eec1c40" containerName="keystone-bootstrap" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.494596 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" containerName="barbican-db-sync" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.495498 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.500498 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.500784 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.500905 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zl5lx" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.510947 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c76548565-62sx9"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.661870 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.681824 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.681899 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.682068 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.682096 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nh29\" (UniqueName: \"kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.682177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.685911 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.688263 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.692740 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.703118 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.715240 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.730522 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.732909 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.742650 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.775885 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.777511 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.781892 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.783657 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.783790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.783873 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.783976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rvt\" (UniqueName: \"kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784091 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784159 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nh29\" (UniqueName: \"kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784247 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784327 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " 
pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784407 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.784513 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.785044 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.785151 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.789834 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.809497 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.812704 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.827029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nh29\" (UniqueName: \"kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29\") pod \"barbican-worker-6b9c8775f7-zkz2s\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.852381 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.853767 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.861550 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.870840 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.870874 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.870966 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.872972 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-7vjtc" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.873242 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.876222 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.889929 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.889971 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890018 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890041 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890068 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0\") pod 
\"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890095 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890123 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890151 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890172 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4662\" (UniqueName: \"kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890190 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890211 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890229 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890247 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdntb\" (UniqueName: \"kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890269 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.890285 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rvt\" (UniqueName: \"kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.891152 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.908293 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.908856 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.910051 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.915905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rvt\" (UniqueName: \"kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt\") pod \"barbican-keystone-listener-b9d896d98-9c696\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:05 crc kubenswrapper[5127]: I0201 07:09:05.929967 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.020020 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021164 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4662\" (UniqueName: \"kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021196 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021222 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021245 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdntb\" (UniqueName: \"kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021271 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021321 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021338 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021355 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021390 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: 
\"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021413 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021448 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021468 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021489 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021515 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021533 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849qx\" (UniqueName: \"kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021557 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021571 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021621 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: 
\"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.021640 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.022440 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.023628 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.024194 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.025686 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.026293 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.032652 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.046605 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.047960 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.056425 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.057117 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.057235 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.066890 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4662\" (UniqueName: \"kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662\") pod \"dnsmasq-dns-7bdf86f46f-jvrtd\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") " pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.080158 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.086395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdntb\" (UniqueName: \"kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb\") pod \"barbican-api-fcd87c97d-l2btf\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") " pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.086493 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.087876 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124765 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124837 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124868 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124897 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124921 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849qx\" (UniqueName: \"kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124942 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.124966 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.125013 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.137568 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 
07:09:06.138054 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.140392 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.147934 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.163676 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.164126 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.164134 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.164413 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.171748 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.186337 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.200650 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849qx\" (UniqueName: \"kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx\") pod \"keystone-5fdd8b75cb-lhmbf\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226058 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: 
I0201 07:09:06.226095 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226123 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226146 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226314 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68jl\" (UniqueName: \"kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226398 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226425 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226491 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226546 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclqx\" (UniqueName: \"kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.226701 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.248118 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.278180 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.286617 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.287494 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.287346 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330605 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330651 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330679 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330701 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330757 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s68jl\" (UniqueName: \"kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc 
kubenswrapper[5127]: I0201 07:09:06.330807 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330835 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330860 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclqx\" (UniqueName: \"kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.330914 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.335236 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.343675 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7bcb954fdc-q646r"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.345062 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.345164 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.346446 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.347359 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.351068 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.352289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.356787 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.365254 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68jl\" (UniqueName: \"kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.370216 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom\") pod \"barbican-worker-7698d9bdb9-bwmxd\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.370992 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7bcb954fdc-q646r"] Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.377328 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclqx\" (UniqueName: \"kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx\") pod \"barbican-keystone-listener-6b5bcb8846-2gxlg\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.389096 5127 generic.go:334] "Generic (PLEG): 
container finished" podID="dc932a0b-f98d-4426-85d6-493c51f87a39" containerID="b5ad36da14aa49f0c289fd664bfe924f8b83d9894729487e9da0fdae6ee61006" exitCode=0 Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.389327 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4" event={"ID":"dc932a0b-f98d-4426-85d6-493c51f87a39","Type":"ContainerDied","Data":"b5ad36da14aa49f0c289fd664bfe924f8b83d9894729487e9da0fdae6ee61006"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.389378 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4" event={"ID":"dc932a0b-f98d-4426-85d6-493c51f87a39","Type":"ContainerStarted","Data":"1c13e1b82c9ebd33bcb5c0633a86ca14d3499679b5e78c335c8b599080eeb4a2"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.407465 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerStarted","Data":"eef349efd9f8afc4c6c60ee85d4c4d3831137e5254d4a5aea09e472c2aed68f6"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.422432 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerStarted","Data":"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.428953 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerStarted","Data":"d9d494912fd4bed9d4a8e96ace41f4839e6099c8b86355e161b702c93bc5920a"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.428989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerStarted","Data":"9c39754222c5e421fee197611eeee3c1dfecc6368f341ae2cba990656bd7fdea"} Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.432708 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj6ts\" (UniqueName: \"kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.432784 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.432864 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.433012 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle\") pod 
\"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.433042 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.459573 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.460494 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.514344 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542307 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542391 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542473 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvkw9\" (UniqueName: \"kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542493 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542534 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542573 
5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542604 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542622 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.542666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj6ts\" (UniqueName: \"kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.544178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.544238 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.544816 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.545219 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.553195 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.553271 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " 
pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.565947 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.593866 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj6ts\" (UniqueName: \"kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts\") pod \"barbican-api-56d87964d8-rmv9v\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.600844 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.613374 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.627460 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.646914 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.646995 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.647041 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.647083 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvkw9\" (UniqueName: \"kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.647104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.647128 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.647179 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.651466 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.652644 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.653293 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.653520 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.660341 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.660521 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.663911 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvkw9\" (UniqueName: \"kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9\") pod \"neutron-7bcb954fdc-q646r\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.694014 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:06 crc kubenswrapper[5127]: I0201 07:09:06.703496 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"] Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.002374 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"] Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.032657 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.050434 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"] Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.261229 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"] Feb 01 07:09:07 crc kubenswrapper[5127]: W0201 07:09:07.381633 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c77e93a_7960_4176_a1a9_907b8118f7a4.slice/crio-a4d573bc858587ab8fe434c0f5629d25a372d3e0a4eb3674db298800da75bce8 WatchSource:0}: Error finding container a4d573bc858587ab8fe434c0f5629d25a372d3e0a4eb3674db298800da75bce8: Status 404 returned error can't find the container with id a4d573bc858587ab8fe434c0f5629d25a372d3e0a4eb3674db298800da75bce8 Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.460288 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerStarted","Data":"9b91679550b2b947c80e3ba413f848575dd11ae15b1e592730c97a168a97b901"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.490149 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerStarted","Data":"a4d573bc858587ab8fe434c0f5629d25a372d3e0a4eb3674db298800da75bce8"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.493044 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" event={"ID":"8baadd78-4c6f-4299-bb05-588666f19720","Type":"ContainerStarted","Data":"f4b6f5b09fbc02ebeff21c32e8b0d72f0cfe6cb6b99ca7958d0660ae2c5f66c1"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.494989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdd8b75cb-lhmbf" event={"ID":"adddcef2-e42a-4f9c-a1c9-08b8253e7616","Type":"ContainerStarted","Data":"8abf53608175bed6672bf9502810cce90ca07b7bbb9ba0772682409192dff0d2"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.497602 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4" event={"ID":"dc932a0b-f98d-4426-85d6-493c51f87a39","Type":"ContainerDied","Data":"1c13e1b82c9ebd33bcb5c0633a86ca14d3499679b5e78c335c8b599080eeb4a2"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.497630 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c13e1b82c9ebd33bcb5c0633a86ca14d3499679b5e78c335c8b599080eeb4a2" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.501927 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.501943 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:09:07 crc 
kubenswrapper[5127]: I0201 07:09:07.501935 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerStarted","Data":"e19f1716ecd2b0c87f65a556845252df1fb29de1fbbf838f3f79a2a579069954"} Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.503913 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.503934 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.551574 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665357 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665747 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665775 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665890 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d55g\" (UniqueName: \"kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665915 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.665964 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb\") pod \"dc932a0b-f98d-4426-85d6-493c51f87a39\" (UID: \"dc932a0b-f98d-4426-85d6-493c51f87a39\") " Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.680515 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g" (OuterVolumeSpecName: "kube-api-access-9d55g") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "kube-api-access-9d55g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.768623 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d55g\" (UniqueName: \"kubernetes.io/projected/dc932a0b-f98d-4426-85d6-493c51f87a39-kube-api-access-9d55g\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.922703 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.961971 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.973763 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:07 crc kubenswrapper[5127]: I0201 07:09:07.973791 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.057184 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"] Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.086122 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.099786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.160371 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config" (OuterVolumeSpecName: "config") pod "dc932a0b-f98d-4426-85d6-493c51f87a39" (UID: "dc932a0b-f98d-4426-85d6-493c51f87a39"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.181249 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.181293 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.181304 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dc932a0b-f98d-4426-85d6-493c51f87a39-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.466624 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"] Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.485196 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"] Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.523788 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:09:08 crc kubenswrapper[5127]: E0201 07:09:08.524185 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc932a0b-f98d-4426-85d6-493c51f87a39" containerName="init" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.524197 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc932a0b-f98d-4426-85d6-493c51f87a39" containerName="init" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.524531 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc932a0b-f98d-4426-85d6-493c51f87a39" containerName="init" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.525790 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.546854 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.627975 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerStarted","Data":"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"} Feb 01 07:09:08 crc kubenswrapper[5127]: W0201 07:09:08.628086 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf17b5eda_d0f1_4e8d_a807_cf1a0bb2928a.slice/crio-717aaaf2700890ab7416f23dfc7faa2c3ae9fd0caa185fb7450b34959c3fc613 WatchSource:0}: Error finding container 717aaaf2700890ab7416f23dfc7faa2c3ae9fd0caa185fb7450b34959c3fc613: Status 404 returned error can't find the container with id 717aaaf2700890ab7416f23dfc7faa2c3ae9fd0caa185fb7450b34959c3fc613 Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.671278 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.671333 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerStarted","Data":"c43bdf483ccdb71e73a52ef433ef6721b305b568dabd90800a9967a4bf4cc820"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.682918 5127 generic.go:334] "Generic (PLEG): container finished" podID="8baadd78-4c6f-4299-bb05-588666f19720" containerID="f6b41e6f76c2670507f1f7149418ea3325709816d5e417716494a1687ad34313" exitCode=0 Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.683015 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" event={"ID":"8baadd78-4c6f-4299-bb05-588666f19720","Type":"ContainerDied","Data":"f6b41e6f76c2670507f1f7149418ea3325709816d5e417716494a1687ad34313"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.707216 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerStarted","Data":"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.707528 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-769f857fd8-mc6lf" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-api" containerID="cri-o://c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4" gracePeriod=30 Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.707685 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-769f857fd8-mc6lf" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.707764 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-769f857fd8-mc6lf" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-httpd" containerID="cri-o://dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa" gracePeriod=30 Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.730835 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.730904 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.730927 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbzch\" (UniqueName: \"kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.730969 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.732314 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.732367 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.732426 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.752942 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdd8b75cb-lhmbf" event={"ID":"adddcef2-e42a-4f9c-a1c9-08b8253e7616","Type":"ContainerStarted","Data":"40b690d53e4e14c7eb51d61afb6d0b0437739a3d5946e3586f5c4d6026b0819a"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.769184 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerStarted","Data":"59caa9defd7b25237630d39d27038f3e0a8a5e123e7f87d19dcdcc603c61f215"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.769353 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-785b58c67b-rrzfw" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.776730 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-7bcb954fdc-q646r"] Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.796693 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-wcxl4" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.797300 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerStarted","Data":"10b465e660e4f2883032c090226b977689b04511bf7c1d7ab8b7d44ff5bb1e77"} Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.797369 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.800242 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c76548565-62sx9" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836615 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836691 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbzch\" (UniqueName: \"kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836731 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836809 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836831 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.836872 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc 
kubenswrapper[5127]: I0201 07:09:08.841959 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.853105 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.856221 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.864289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.880930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.884953 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:08 crc kubenswrapper[5127]: I0201 07:09:08.891052 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbzch\" (UniqueName: \"kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch\") pod \"placement-6fbd756774-8bz24\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.009105 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-769f857fd8-mc6lf" podStartSLOduration=8.009084359 podStartE2EDuration="8.009084359s" podCreationTimestamp="2026-02-01 07:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:08.756779129 +0000 UTC m=+1299.242681512" watchObservedRunningTime="2026-02-01 07:09:09.009084359 +0000 UTC m=+1299.494986722" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.036036 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5fdd8b75cb-lhmbf" podStartSLOduration=4.036016786 podStartE2EDuration="4.036016786s" podCreationTimestamp="2026-02-01 07:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-01 07:09:08.777605402 +0000 UTC m=+1299.263507765" watchObservedRunningTime="2026-02-01 07:09:09.036016786 +0000 UTC m=+1299.521919149" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.062251 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-785b58c67b-rrzfw" podStartSLOduration=6.062232306 podStartE2EDuration="6.062232306s" podCreationTimestamp="2026-02-01 07:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:08.819252308 +0000 UTC m=+1299.305154671" watchObservedRunningTime="2026-02-01 07:09:09.062232306 +0000 UTC m=+1299.548134659" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.187034 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.187121 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"] Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.221256 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-wcxl4"] Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.222901 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c76548565-62sx9" podStartSLOduration=6.222890488 podStartE2EDuration="6.222890488s" podCreationTimestamp="2026-02-01 07:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:08.889960489 +0000 UTC m=+1299.375862852" watchObservedRunningTime="2026-02-01 07:09:09.222890488 +0000 UTC m=+1299.708792851" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.342767 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.825986 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerStarted","Data":"2595d723318172e4cd538176cf76ad6898d913735e3ef93149f84fa24866fc73"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.826552 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerStarted","Data":"4e6a99faf8229c47fafa147a15596fcc59573ee65e2838ff527c7f0b7195093b"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.847328 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerStarted","Data":"f754d2dd9af31fff605fb93bb5e24240954c7191bee15e280e72a4fafc44bfb2"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.850514 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.851199 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerStarted","Data":"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.851279 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.851303 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-fcd87c97d-l2btf" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.859486 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" event={"ID":"8baadd78-4c6f-4299-bb05-588666f19720","Type":"ContainerStarted","Data":"e5903e7f1235527332e8b7dd44c14a8ea5204e3d249efe266568d10d0397c7f2"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.860252 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.885875 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-fcd87c97d-l2btf" podStartSLOduration=4.885854448 podStartE2EDuration="4.885854448s" podCreationTimestamp="2026-02-01 07:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:09.869790403 +0000 UTC m=+1300.355692756" watchObservedRunningTime="2026-02-01 07:09:09.885854448 +0000 UTC m=+1300.371756811" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.886881 5127 generic.go:334] "Generic (PLEG): container finished" podID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerID="dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa" exitCode=0 Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.886973 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerDied","Data":"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.894549 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerStarted","Data":"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.894629 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerStarted","Data":"de394e66fc791cb4226c4b103b88e2e418c115f640034c3a5719723870a2e0d7"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.904104 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.904127 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.904211 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerStarted","Data":"717aaaf2700890ab7416f23dfc7faa2c3ae9fd0caa185fb7450b34959c3fc613"} Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.904905 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.904934 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-785b58c67b-rrzfw" Feb 01 07:09:09 crc kubenswrapper[5127]: I0201 07:09:09.958562 5127 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" podStartSLOduration=4.958540713 podStartE2EDuration="4.958540713s" podCreationTimestamp="2026-02-01 07:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:09.896047773 +0000 UTC m=+1300.381950136" watchObservedRunningTime="2026-02-01 07:09:09.958540713 +0000 UTC m=+1300.444443086" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.308926 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc932a0b-f98d-4426-85d6-493c51f87a39" path="/var/lib/kubelet/pods/dc932a0b-f98d-4426-85d6-493c51f87a39/volumes" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.409680 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"] Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.449645 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"] Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.451463 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.456223 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.456630 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.489700 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"] Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.603092 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.603411 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.603714 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.603893 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.603968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.604113 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.604133 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbmdv\" (UniqueName: \"kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.707507 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708156 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708229 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbmdv\" (UniqueName: \"kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708319 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.708493 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.709928 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.737393 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.749244 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.749253 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbmdv\" (UniqueName: \"kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.749778 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.750078 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.752910 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data\") pod \"barbican-api-6969499d9b-sjxsr\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.771921 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.862286 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.874572 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.958688 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerStarted","Data":"0ccf2b7841327abd2475fd89b38cce540d58bccf850d4bc3cdc5becd6eb10e22"} Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.959945 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.959991 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.963230 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerStarted","Data":"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae"} Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.964805 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.967175 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerStarted","Data":"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5"} Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.967200 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerStarted","Data":"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f"} Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.967213 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.967221 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerStarted","Data":"a7789c4cd775313f628a5126de285bf080b0610efdda813fd7d4bd315bc73b60"} Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.978800 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:09:10 crc kubenswrapper[5127]: I0201 07:09:10.985457 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56d87964d8-rmv9v" podStartSLOduration=4.985438429 podStartE2EDuration="4.985438429s" podCreationTimestamp="2026-02-01 07:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:10.975528721 +0000 UTC m=+1301.461431084" watchObservedRunningTime="2026-02-01 07:09:10.985438429 +0000 UTC m=+1301.471340792" Feb 01 07:09:11 crc kubenswrapper[5127]: I0201 07:09:11.002566 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7bcb954fdc-q646r" podStartSLOduration=5.002546751 podStartE2EDuration="5.002546751s" podCreationTimestamp="2026-02-01 
Feb 01 07:09:11 crc kubenswrapper[5127]: I0201 07:09:11.043520 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6fbd756774-8bz24" podStartSLOduration=3.043501678 podStartE2EDuration="3.043501678s" podCreationTimestamp="2026-02-01 07:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:11.022572883 +0000 UTC m=+1301.508475246" watchObservedRunningTime="2026-02-01 07:09:11.043501678 +0000 UTC m=+1301.529404041"
Feb 01 07:09:11 crc kubenswrapper[5127]: I0201 07:09:11.976593 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fcd87c97d-l2btf" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api-log" containerID="cri-o://9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa" gracePeriod=30
Feb 01 07:09:11 crc kubenswrapper[5127]: I0201 07:09:11.977175 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-fcd87c97d-l2btf" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api" containerID="cri-o://489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065" gracePeriod=30
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.441115 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.644268 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fcd87c97d-l2btf"
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.752136 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdntb\" (UniqueName: \"kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb\") pod \"1c77e93a-7960-4176-a1a9-907b8118f7a4\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") "
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.752304 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle\") pod \"1c77e93a-7960-4176-a1a9-907b8118f7a4\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") "
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.752376 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs\") pod \"1c77e93a-7960-4176-a1a9-907b8118f7a4\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") "
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.752430 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom\") pod \"1c77e93a-7960-4176-a1a9-907b8118f7a4\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") "
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.752499 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data\") pod \"1c77e93a-7960-4176-a1a9-907b8118f7a4\" (UID: \"1c77e93a-7960-4176-a1a9-907b8118f7a4\") "
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.754505 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs" (OuterVolumeSpecName: "logs") pod "1c77e93a-7960-4176-a1a9-907b8118f7a4" (UID: "1c77e93a-7960-4176-a1a9-907b8118f7a4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.763575 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb" (OuterVolumeSpecName: "kube-api-access-fdntb") pod "1c77e93a-7960-4176-a1a9-907b8118f7a4" (UID: "1c77e93a-7960-4176-a1a9-907b8118f7a4"). InnerVolumeSpecName "kube-api-access-fdntb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.782544 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c77e93a-7960-4176-a1a9-907b8118f7a4" (UID: "1c77e93a-7960-4176-a1a9-907b8118f7a4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.796890 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"]
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.825970 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c77e93a-7960-4176-a1a9-907b8118f7a4" (UID: "1c77e93a-7960-4176-a1a9-907b8118f7a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.847728 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data" (OuterVolumeSpecName: "config-data") pod "1c77e93a-7960-4176-a1a9-907b8118f7a4" (UID: "1c77e93a-7960-4176-a1a9-907b8118f7a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.855825 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c77e93a-7960-4176-a1a9-907b8118f7a4-logs\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.855856 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.855869 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.855878 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdntb\" (UniqueName: \"kubernetes.io/projected/1c77e93a-7960-4176-a1a9-907b8118f7a4-kube-api-access-fdntb\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.855890 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c77e93a-7960-4176-a1a9-907b8118f7a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.991273 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerStarted","Data":"29d8d027dbe06246751c1b56e85016b77f2dd4ca87ded166e55fa2c4832c64ec"}
Feb 01 07:09:12 crc kubenswrapper[5127]: I0201 07:09:12.991655 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerStarted","Data":"a3a77f3f69d363acbcf4efc5d0f20f16e293179511ce86f0cdd3c1b58066afa5"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.001266 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerStarted","Data":"8b6cf0fce9191b5f92f6dba5f32440813d779ca5c4bfc287007eb0b70522a42e"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.001438 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerStarted","Data":"7793537fc2a3ab52b59146e273aa2bff9c70fc06368a4e4572165b5882b98880"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.005274 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerStarted","Data":"ff748cfb27efa83582460b877690b1b4a2259f655935539f98a139ae9ff125cf"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.007648 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerStarted","Data":"f4c7d48d77b6440e94b4192e6678a4b60093900d8ce01ef00c84f59e13525610"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.012296 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" podStartSLOduration=3.449711999 podStartE2EDuration="7.012284134s" podCreationTimestamp="2026-02-01 07:09:06 +0000 UTC" firstStartedPulling="2026-02-01 07:09:08.655328627 +0000 UTC m=+1299.141230990" lastFinishedPulling="2026-02-01 07:09:12.217900772 +0000 UTC m=+1302.703803125" observedRunningTime="2026-02-01 07:09:13.006841447 +0000 UTC m=+1303.492743810" watchObservedRunningTime="2026-02-01 07:09:13.012284134 +0000 UTC m=+1303.498186497"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019744 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerID="489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065" exitCode=0
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019777 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerID="9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa" exitCode=143
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerDied","Data":"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerDied","Data":"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019910 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-fcd87c97d-l2btf" event={"ID":"1c77e93a-7960-4176-a1a9-907b8118f7a4","Type":"ContainerDied","Data":"a4d573bc858587ab8fe434c0f5629d25a372d3e0a4eb3674db298800da75bce8"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.019929 5127 scope.go:117] "RemoveContainer" containerID="489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.020109 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-fcd87c97d-l2btf"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.023172 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerStarted","Data":"a3ab6404657e8a50a3cc043680876f80e95b3982cb32682933e72885d036811f"}
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.042388 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" podStartSLOduration=3.196065233 podStartE2EDuration="8.042368077s" podCreationTimestamp="2026-02-01 07:09:05 +0000 UTC" firstStartedPulling="2026-02-01 07:09:07.373097399 +0000 UTC m=+1297.858999762" lastFinishedPulling="2026-02-01 07:09:12.219400243 +0000 UTC m=+1302.705302606" observedRunningTime="2026-02-01 07:09:13.034851224 +0000 UTC m=+1303.520753587" watchObservedRunningTime="2026-02-01 07:09:13.042368077 +0000 UTC m=+1303.528270440"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.057334 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"]
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.062102 5127 scope.go:117] "RemoveContainer" containerID="9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.066008 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" podStartSLOduration=3.223442973 podStartE2EDuration="8.065993426s" podCreationTimestamp="2026-02-01 07:09:05 +0000 UTC" firstStartedPulling="2026-02-01 07:09:07.37684555 +0000 UTC m=+1297.862747913" lastFinishedPulling="2026-02-01 07:09:12.219396003 +0000 UTC m=+1302.705298366" observedRunningTime="2026-02-01 07:09:13.057168087 +0000 UTC m=+1303.543070450" watchObservedRunningTime="2026-02-01 07:09:13.065993426 +0000 UTC m=+1303.551895789"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.088707 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"]
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.100698 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-fcd87c97d-l2btf"]
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.118828 5127 scope.go:117] "RemoveContainer" containerID="489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"
Feb 01 07:09:13 crc kubenswrapper[5127]: E0201 07:09:13.119757 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065\": container with ID starting with 489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065 not found: ID does not exist" containerID="489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.119794 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"} err="failed to get container status \"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065\": rpc error: code = NotFound desc = could not find container \"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065\": container with ID starting with 489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065 not found: ID does not exist"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.119819 5127 scope.go:117] "RemoveContainer" containerID="9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"
Feb 01 07:09:13 crc kubenswrapper[5127]: E0201 07:09:13.120037 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa\": container with ID starting with 9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa not found: ID does not exist" containerID="9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.120076 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"} err="failed to get container status \"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa\": rpc error: code = NotFound desc = could not find container \"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa\": container with ID starting with 9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa not found: ID does not exist"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.120101 5127 scope.go:117] "RemoveContainer" containerID="489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.120344 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065"} err="failed to get container status \"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065\": rpc error: code = NotFound desc = could not find container \"489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065\": container with ID starting with 489076a20c451049c552754ba4e86ac01d9d52baa1e4d91cd74cbe7b553db065 not found: ID does not exist"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.120367 5127 scope.go:117] "RemoveContainer" containerID="9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.120555 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa"} err="failed to get container status \"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa\": rpc error: code = NotFound desc = could not find container \"9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa\": container with ID starting with 9bb03646ca1acf4e3a838edc58fd3b30615250e2dd0c82de3ebf05af2955cafa not found: ID does not exist"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.123120 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" podStartSLOduration=2.987751803 podStartE2EDuration="7.123098259s" podCreationTimestamp="2026-02-01 07:09:06 +0000 UTC" firstStartedPulling="2026-02-01 07:09:08.08569032 +0000 UTC m=+1298.571592683" lastFinishedPulling="2026-02-01 07:09:12.221036776 +0000 UTC m=+1302.706939139" observedRunningTime="2026-02-01 07:09:13.110994102 +0000 UTC m=+1303.596896465" watchObservedRunningTime="2026-02-01 07:09:13.123098259 +0000 UTC m=+1303.609000622"
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.162997 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"]
Feb 01 07:09:13 crc kubenswrapper[5127]: I0201 07:09:13.755635 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.041713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerStarted","Data":"9cb36f2e85f77ff44283559edf3708d36ee5e3318b541875332b0b29e666852a"}
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.044834 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerStarted","Data":"62c7e7aeed632c501e98dba48dbb0ca73647880b2adcc2c89f331f126d30002a"}
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.044859 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerStarted","Data":"829b8906c7d8a005a0f0715b5027bcf6b0f42ef7cc11158f5d59737c1d368916"}
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.045531 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6969499d9b-sjxsr"
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.045552 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6969499d9b-sjxsr"
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.062559 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerStarted","Data":"c29366b00ecfb7dffff5a9a80692040e245c0a01c1fbbaf5d4d101f0738c006c"}
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.065004 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6969499d9b-sjxsr" podStartSLOduration=4.064985578 podStartE2EDuration="4.064985578s" podCreationTimestamp="2026-02-01 07:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:14.063799676 +0000 UTC m=+1304.549702039" watchObservedRunningTime="2026-02-01 07:09:14.064985578 +0000 UTC m=+1304.550887941"
Feb 01 07:09:14 crc kubenswrapper[5127]: I0201 07:09:14.249922 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" path="/var/lib/kubelet/pods/1c77e93a-7960-4176-a1a9-907b8118f7a4/volumes"
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.073773 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-w7586" event={"ID":"4f4d5a37-3a02-493f-9cf9-d53931c2a92b","Type":"ContainerStarted","Data":"25aceb3b56d27bad63c6a39a8a7c21031da417b927c23a72504a99b04f2dbf18"}
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.074230 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener-log" containerID="cri-o://ff748cfb27efa83582460b877690b1b4a2259f655935539f98a139ae9ff125cf" gracePeriod=30
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.074413 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker-log" containerID="cri-o://7793537fc2a3ab52b59146e273aa2bff9c70fc06368a4e4572165b5882b98880" gracePeriod=30
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.074455 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener" containerID="cri-o://9cb36f2e85f77ff44283559edf3708d36ee5e3318b541875332b0b29e666852a" gracePeriod=30
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.074593 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker" containerID="cri-o://8b6cf0fce9191b5f92f6dba5f32440813d779ca5c4bfc287007eb0b70522a42e" gracePeriod=30
Feb 01 07:09:15 crc kubenswrapper[5127]: I0201 07:09:15.099762 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-w7586" podStartSLOduration=3.701775243 podStartE2EDuration="43.099740337s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="2026-02-01 07:08:34.351792873 +0000 UTC m=+1264.837695246" lastFinishedPulling="2026-02-01 07:09:13.749757977 +0000 UTC m=+1304.235660340" observedRunningTime="2026-02-01 07:09:15.089324895 +0000 UTC m=+1305.575227258" watchObservedRunningTime="2026-02-01 07:09:15.099740337 +0000 UTC m=+1305.585642700"
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.081841 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd"
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.122734 5127 generic.go:334] "Generic (PLEG): container finished" podID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerID="8b6cf0fce9191b5f92f6dba5f32440813d779ca5c4bfc287007eb0b70522a42e" exitCode=0
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.122764 5127 generic.go:334] "Generic (PLEG): container finished" podID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerID="7793537fc2a3ab52b59146e273aa2bff9c70fc06368a4e4572165b5882b98880" exitCode=143
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.122869 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerDied","Data":"8b6cf0fce9191b5f92f6dba5f32440813d779ca5c4bfc287007eb0b70522a42e"}
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.122895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerDied","Data":"7793537fc2a3ab52b59146e273aa2bff9c70fc06368a4e4572165b5882b98880"}
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.124880 5127 generic.go:334] "Generic (PLEG): container finished" podID="d876e519-4139-4791-a0a7-bb9878e91a72" containerID="9cb36f2e85f77ff44283559edf3708d36ee5e3318b541875332b0b29e666852a" exitCode=0
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.124900 5127 generic.go:334] "Generic (PLEG): container finished" podID="d876e519-4139-4791-a0a7-bb9878e91a72" containerID="ff748cfb27efa83582460b877690b1b4a2259f655935539f98a139ae9ff125cf" exitCode=143
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.125818 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerDied","Data":"9cb36f2e85f77ff44283559edf3708d36ee5e3318b541875332b0b29e666852a"}
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.125847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerDied","Data":"ff748cfb27efa83582460b877690b1b4a2259f655935539f98a139ae9ff125cf"}
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.186123 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"]
Feb 01 07:09:16 crc kubenswrapper[5127]: I0201 07:09:16.186407 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="dnsmasq-dns" containerID="cri-o://036f77806ebb36bb54244e2f4b5da15343f6663f39a060b67eef97821d693e78" gracePeriod=10
Feb 01 07:09:17 crc kubenswrapper[5127]: I0201 07:09:17.135869 5127 generic.go:334] "Generic (PLEG): container finished" podID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerID="036f77806ebb36bb54244e2f4b5da15343f6663f39a060b67eef97821d693e78" exitCode=0
Feb 01 07:09:17 crc kubenswrapper[5127]: I0201 07:09:17.136033 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" event={"ID":"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb","Type":"ContainerDied","Data":"036f77806ebb36bb54244e2f4b5da15343f6663f39a060b67eef97821d693e78"}
Feb 01 07:09:17 crc kubenswrapper[5127]: I0201 07:09:17.933767 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d87964d8-rmv9v"
Feb 01 07:09:18 crc kubenswrapper[5127]: I0201 07:09:18.102018 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d87964d8-rmv9v"
Feb 01 07:09:18 crc kubenswrapper[5127]: I0201 07:09:18.893986 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: connect: connection refused"
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.506542 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24"
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.520532 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b9c8775f7-zkz2s"
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.606343 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b9d896d98-9c696"
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634646 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrhmj\" (UniqueName: \"kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634719 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634763 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634875 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634896 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634915 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nh29\" (UniqueName: \"kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29\") pod \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.634954 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data\") pod \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.635039 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle\") pod \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.635074 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom\") pod \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") "
Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.635111 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs\") pod \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") "
\"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\" (UID: \"bcd541f4-33fd-42c1-a5af-5b9b1ddee054\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.635152 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb\") pod \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\" (UID: \"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.637130 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs" (OuterVolumeSpecName: "logs") pod "bcd541f4-33fd-42c1-a5af-5b9b1ddee054" (UID: "bcd541f4-33fd-42c1-a5af-5b9b1ddee054"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.642888 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bcd541f4-33fd-42c1-a5af-5b9b1ddee054" (UID: "bcd541f4-33fd-42c1-a5af-5b9b1ddee054"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.653822 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29" (OuterVolumeSpecName: "kube-api-access-4nh29") pod "bcd541f4-33fd-42c1-a5af-5b9b1ddee054" (UID: "bcd541f4-33fd-42c1-a5af-5b9b1ddee054"). InnerVolumeSpecName "kube-api-access-4nh29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.654803 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj" (OuterVolumeSpecName: "kube-api-access-qrhmj") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "kube-api-access-qrhmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.672117 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcd541f4-33fd-42c1-a5af-5b9b1ddee054" (UID: "bcd541f4-33fd-42c1-a5af-5b9b1ddee054"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.688968 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config" (OuterVolumeSpecName: "config") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.699956 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data" (OuterVolumeSpecName: "config-data") pod "bcd541f4-33fd-42c1-a5af-5b9b1ddee054" (UID: "bcd541f4-33fd-42c1-a5af-5b9b1ddee054"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.701172 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.707194 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.721903 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.724792 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" (UID: "f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.737253 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4rvt\" (UniqueName: \"kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt\") pod \"d876e519-4139-4791-a0a7-bb9878e91a72\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.737330 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle\") pod \"d876e519-4139-4791-a0a7-bb9878e91a72\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.737474 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs\") pod \"d876e519-4139-4791-a0a7-bb9878e91a72\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.737650 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data\") pod \"d876e519-4139-4791-a0a7-bb9878e91a72\" (UID: \"d876e519-4139-4791-a0a7-bb9878e91a72\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.737709 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom\") pod \"d876e519-4139-4791-a0a7-bb9878e91a72\" (UID: 
\"d876e519-4139-4791-a0a7-bb9878e91a72\") " Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738205 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738237 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738254 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738275 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nh29\" (UniqueName: \"kubernetes.io/projected/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-kube-api-access-4nh29\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738294 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738309 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738323 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738337 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcd541f4-33fd-42c1-a5af-5b9b1ddee054-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738350 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738365 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrhmj\" (UniqueName: \"kubernetes.io/projected/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-kube-api-access-qrhmj\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738382 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.738263 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs" (OuterVolumeSpecName: "logs") pod "d876e519-4139-4791-a0a7-bb9878e91a72" (UID: "d876e519-4139-4791-a0a7-bb9878e91a72"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.741343 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt" (OuterVolumeSpecName: "kube-api-access-v4rvt") pod "d876e519-4139-4791-a0a7-bb9878e91a72" (UID: "d876e519-4139-4791-a0a7-bb9878e91a72"). InnerVolumeSpecName "kube-api-access-v4rvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.741679 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d876e519-4139-4791-a0a7-bb9878e91a72" (UID: "d876e519-4139-4791-a0a7-bb9878e91a72"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.757495 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d876e519-4139-4791-a0a7-bb9878e91a72" (UID: "d876e519-4139-4791-a0a7-bb9878e91a72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.784557 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data" (OuterVolumeSpecName: "config-data") pod "d876e519-4139-4791-a0a7-bb9878e91a72" (UID: "d876e519-4139-4791-a0a7-bb9878e91a72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.840465 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.840536 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.840564 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4rvt\" (UniqueName: \"kubernetes.io/projected/d876e519-4139-4791-a0a7-bb9878e91a72-kube-api-access-v4rvt\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.840625 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d876e519-4139-4791-a0a7-bb9878e91a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:19 crc kubenswrapper[5127]: I0201 07:09:19.840650 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d876e519-4139-4791-a0a7-bb9878e91a72-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.176958 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" event={"ID":"d876e519-4139-4791-a0a7-bb9878e91a72","Type":"ContainerDied","Data":"9b91679550b2b947c80e3ba413f848575dd11ae15b1e592730c97a168a97b901"} Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.177005 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b9d896d98-9c696" Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.177316 5127 scope.go:117] "RemoveContainer" containerID="9cb36f2e85f77ff44283559edf3708d36ee5e3318b541875332b0b29e666852a" Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.179354 5127 generic.go:334] "Generic (PLEG): container finished" podID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" containerID="25aceb3b56d27bad63c6a39a8a7c21031da417b927c23a72504a99b04f2dbf18" exitCode=0 Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.179383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-w7586" event={"ID":"4f4d5a37-3a02-493f-9cf9-d53931c2a92b","Type":"ContainerDied","Data":"25aceb3b56d27bad63c6a39a8a7c21031da417b927c23a72504a99b04f2dbf18"} Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.182048 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-ljm24" event={"ID":"f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb","Type":"ContainerDied","Data":"29eb6a8736a554e78d4c3df0e0597260b6b9057bcde17336fc80817881f9889c"} Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.182420 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185352 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerStarted","Data":"063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7"}
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185497 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-central-agent" containerID="cri-o://2fcf09cf738ee71b3db6533f0029a17da44d8dbcd16a1db45a01f5156188fe21" gracePeriod=30
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185647 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185722 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="proxy-httpd" containerID="cri-o://063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7" gracePeriod=30
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185783 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="sg-core" containerID="cri-o://1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3" gracePeriod=30
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.185836 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-notification-agent" containerID="cri-o://28ed629b545fff6fa4fe5b198e94d2adaf4720408ded6f5999779c65743ee58c" gracePeriod=30
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.194487 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6b9c8775f7-zkz2s" event={"ID":"bcd541f4-33fd-42c1-a5af-5b9b1ddee054","Type":"ContainerDied","Data":"e19f1716ecd2b0c87f65a556845252df1fb29de1fbbf838f3f79a2a579069954"}
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.194644 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6b9c8775f7-zkz2s"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.226278 5127 scope.go:117] "RemoveContainer" containerID="ff748cfb27efa83582460b877690b1b4a2259f655935539f98a139ae9ff125cf"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.229721 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.257282 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-b9d896d98-9c696"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.260291 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.234759989 podStartE2EDuration="48.260268124s" podCreationTimestamp="2026-02-01 07:08:32 +0000 UTC" firstStartedPulling="2026-02-01 07:08:34.339440039 +0000 UTC m=+1264.825342402" lastFinishedPulling="2026-02-01 07:09:19.364948174 +0000 UTC m=+1309.850850537" observedRunningTime="2026-02-01 07:09:20.241924248 +0000 UTC m=+1310.727826621" watchObservedRunningTime="2026-02-01 07:09:20.260268124 +0000 UTC m=+1310.746170487"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.263753 5127 scope.go:117] "RemoveContainer" containerID="036f77806ebb36bb54244e2f4b5da15343f6663f39a060b67eef97821d693e78"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.273613 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.283440 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6b9c8775f7-zkz2s"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.291352 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.300178 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-ljm24"]
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.373463 5127 scope.go:117] "RemoveContainer" containerID="c9d8727af711b74e55d846dc4fd82093551dc35a13c0d59410e7ab90746e2061"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.405992 5127 scope.go:117] "RemoveContainer" containerID="8b6cf0fce9191b5f92f6dba5f32440813d779ca5c4bfc287007eb0b70522a42e"
Feb 01 07:09:20 crc kubenswrapper[5127]: I0201 07:09:20.463437 5127 scope.go:117] "RemoveContainer" containerID="7793537fc2a3ab52b59146e273aa2bff9c70fc06368a4e4572165b5882b98880"
Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.223982 5127 generic.go:334] "Generic (PLEG): container finished" podID="de51194a-4317-47c7-a5a8-cb81905825f2" containerID="063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7" exitCode=0
Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.224310 5127 generic.go:334] "Generic (PLEG): container finished" podID="de51194a-4317-47c7-a5a8-cb81905825f2" containerID="1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3" exitCode=2
Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.224334 5127 generic.go:334] "Generic (PLEG): container finished" podID="de51194a-4317-47c7-a5a8-cb81905825f2" containerID="2fcf09cf738ee71b3db6533f0029a17da44d8dbcd16a1db45a01f5156188fe21" exitCode=0
Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.224560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7"}
event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7"} Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.224625 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3"} Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.224667 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"2fcf09cf738ee71b3db6533f0029a17da44d8dbcd16a1db45a01f5156188fe21"} Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.623010 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-w7586" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.782461 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.782955 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.782995 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sjg2\" (UniqueName: \"kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.783062 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.783157 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.783195 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id\") pod \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\" (UID: \"4f4d5a37-3a02-493f-9cf9-d53931c2a92b\") " Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.783810 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.788129 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2" (OuterVolumeSpecName: "kube-api-access-5sjg2") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "kube-api-access-5sjg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.788808 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.788866 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts" (OuterVolumeSpecName: "scripts") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.829687 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.850827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data" (OuterVolumeSpecName: "config-data") pod "4f4d5a37-3a02-493f-9cf9-d53931c2a92b" (UID: "4f4d5a37-3a02-493f-9cf9-d53931c2a92b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885206 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885250 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885267 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885282 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885297 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:21 crc kubenswrapper[5127]: I0201 07:09:21.885309 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sjg2\" (UniqueName: \"kubernetes.io/projected/4f4d5a37-3a02-493f-9cf9-d53931c2a92b-kube-api-access-5sjg2\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.240103 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.276652 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" path="/var/lib/kubelet/pods/bcd541f4-33fd-42c1-a5af-5b9b1ddee054/volumes"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.285168 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" path="/var/lib/kubelet/pods/d876e519-4139-4791-a0a7-bb9878e91a72/volumes"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.286458 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" path="/var/lib/kubelet/pods/f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb/volumes"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.287530 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6969499d9b-sjxsr"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.287564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-w7586" event={"ID":"4f4d5a37-3a02-493f-9cf9-d53931c2a92b","Type":"ContainerDied","Data":"53f109a81663f0aab4b3b2f197ee158e05be36f0618b86015ac8d29afd5998fe"}
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.287688 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53f109a81663f0aab4b3b2f197ee158e05be36f0618b86015ac8d29afd5998fe"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.287834 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6969499d9b-sjxsr"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.451091 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.451511 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d87964d8-rmv9v" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" containerID="cri-o://2595d723318172e4cd538176cf76ad6898d913735e3ef93149f84fa24866fc73" gracePeriod=30
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.452019 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d87964d8-rmv9v" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api" containerID="cri-o://0ccf2b7841327abd2475fd89b38cce540d58bccf850d4bc3cdc5becd6eb10e22" gracePeriod=30
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.480142 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d87964d8-rmv9v" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": EOF"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.687699 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689052 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689087 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689101 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" containerName="cinder-db-sync"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689107 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" containerName="cinder-db-sync"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689118 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689124 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689149 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689155 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689164 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="dnsmasq-dns"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689169 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="dnsmasq-dns"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689183 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689200 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689213 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689218 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689227 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689234 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: E0201 07:09:22.689241 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="init"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689247 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="init"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689398 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689410 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689419 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d876e519-4139-4791-a0a7-bb9878e91a72" containerName="barbican-keystone-listener"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689427 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcd541f4-33fd-42c1-a5af-5b9b1ddee054" containerName="barbican-worker"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689442 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f80e7fbd-d4ce-4c3b-9869-b0c0ebd955fb" containerName="dnsmasq-dns"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689448 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689456 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c77e93a-7960-4176-a1a9-907b8118f7a4" containerName="barbican-api-log"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.689464 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" containerName="cinder-db-sync"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.690395 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.695199 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.695780 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8l8d5"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.695877 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.695885 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.706727 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.766638 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.768162 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.782558 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821626 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fv85\" (UniqueName: \"kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821695 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821784 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821810 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821850 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.821873 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.865000 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.871540 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.875554 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.877723 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923615 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923688 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fv85\" (UniqueName: \"kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923746 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923785 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923815 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923882 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45p7k\" (UniqueName: \"kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923911 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923941 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.923993 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.924020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.924050 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.924108 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.925794 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.929989 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.932671 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.934154 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.942185 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:22 crc kubenswrapper[5127]: I0201 07:09:22.958260 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fv85\" (UniqueName: \"kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85\") pod \"cinder-scheduler-0\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0"
\"474c7fb9-fcd9-48aa-9287-d114245f9a63\") " pod="openstack/cinder-scheduler-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.013392 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025600 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025654 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025677 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025860 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45p7k\" (UniqueName: \"kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025916 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025939 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hssf\" (UniqueName: \"kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025970 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.025993 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026096 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026140 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026191 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026291 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026331 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.026463 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.030215 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.030215 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.030284 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.031278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " 
pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.047252 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45p7k\" (UniqueName: \"kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k\") pod \"dnsmasq-dns-75bfc9b94f-lm7b9\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.094768 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.131720 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.131977 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132030 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132076 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132100 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132155 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hssf\" (UniqueName: \"kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132177 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132262 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.132321 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.135409 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.142724 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.143437 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.143815 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.158145 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hssf\" (UniqueName: \"kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf\") pod \"cinder-api-0\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.189516 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.292324 5127 generic.go:334] "Generic (PLEG): container finished" podID="de51194a-4317-47c7-a5a8-cb81905825f2" containerID="28ed629b545fff6fa4fe5b198e94d2adaf4720408ded6f5999779c65743ee58c" exitCode=0 Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.292408 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"28ed629b545fff6fa4fe5b198e94d2adaf4720408ded6f5999779c65743ee58c"} Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.313991 5127 generic.go:334] "Generic (PLEG): container finished" podID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerID="2595d723318172e4cd538176cf76ad6898d913735e3ef93149f84fa24866fc73" exitCode=143 Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.314659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerDied","Data":"2595d723318172e4cd538176cf76ad6898d913735e3ef93149f84fa24866fc73"} Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.516995 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 07:09:23 crc kubenswrapper[5127]: W0201 07:09:23.523776 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312 WatchSource:0}: Error finding container 9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312: Status 404 returned error can't find the container with id 9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312 Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.568766 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.710225 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"]
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.742546 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753003 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753085 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9hk9\" (UniqueName: \"kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753104 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753206 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753237 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753291 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753328 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml\") pod \"de51194a-4317-47c7-a5a8-cb81905825f2\" (UID: \"de51194a-4317-47c7-a5a8-cb81905825f2\") "
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.753740 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.756763 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.760054 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts" (OuterVolumeSpecName: "scripts") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.770010 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9" (OuterVolumeSpecName: "kube-api-access-w9hk9") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "kube-api-access-w9hk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.786750 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.858748 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.858982 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.858991 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.859000 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de51194a-4317-47c7-a5a8-cb81905825f2-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.859010 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9hk9\" (UniqueName: \"kubernetes.io/projected/de51194a-4317-47c7-a5a8-cb81905825f2-kube-api-access-w9hk9\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.868204 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.915811 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data" (OuterVolumeSpecName: "config-data") pod "de51194a-4317-47c7-a5a8-cb81905825f2" (UID: "de51194a-4317-47c7-a5a8-cb81905825f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.960412 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:23 crc kubenswrapper[5127]: I0201 07:09:23.960446 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de51194a-4317-47c7-a5a8-cb81905825f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.329518 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de51194a-4317-47c7-a5a8-cb81905825f2","Type":"ContainerDied","Data":"3757d0918db759aa076fcc4f438945e528d5edbfbb3a417cba097b29b486f0d1"}
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.329570 5127 scope.go:117] "RemoveContainer" containerID="063a7734071689591e21001d6f69590a344b9f5557ce25b35bcba7daf76b2ff7"
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.329718 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.339603 5127 generic.go:334] "Generic (PLEG): container finished" podID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerID="fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02" exitCode=0
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.339653 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" event={"ID":"28fa3b9a-8a1d-4954-89eb-6bf203c729d2","Type":"ContainerDied","Data":"fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02"}
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.339676 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" event={"ID":"28fa3b9a-8a1d-4954-89eb-6bf203c729d2","Type":"ContainerStarted","Data":"ca21adb9e284a0633afbc3c173e31c2cdefe126ac7fe4580fc56eae716e78b85"}
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.345165 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerStarted","Data":"051bef1682bbe3a86a72137d0dd245acfc06d96b49f38c6f33136027c62f85a7"}
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.350065 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerStarted","Data":"9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312"}
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.352697 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.360937 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.373387 5127 scope.go:117] "RemoveContainer" containerID="1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3"
containerID="1a00688a65ac59b4c19a05a44e0462a0f9a26a6480040c8e5c8fa08f5e4560c3" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.394703 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:24 crc kubenswrapper[5127]: E0201 07:09:24.395097 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="sg-core" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395113 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="sg-core" Feb 01 07:09:24 crc kubenswrapper[5127]: E0201 07:09:24.395145 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-central-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395151 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-central-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: E0201 07:09:24.395161 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-notification-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395167 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-notification-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: E0201 07:09:24.395178 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="proxy-httpd" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395190 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="proxy-httpd" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395334 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="proxy-httpd" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395352 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="sg-core" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395363 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-central-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.395374 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" containerName="ceilometer-notification-agent" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.396850 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.398928 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.399285 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.405415 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.409267 5127 scope.go:117] "RemoveContainer" containerID="28ed629b545fff6fa4fe5b198e94d2adaf4720408ded6f5999779c65743ee58c" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.449006 5127 scope.go:117] "RemoveContainer" containerID="2fcf09cf738ee71b3db6533f0029a17da44d8dbcd16a1db45a01f5156188fe21" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479535 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479601 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479652 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479700 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2df6\" (UniqueName: \"kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479735 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479775 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.479805 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 
07:09:24.581132 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2df6\" (UniqueName: \"kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581197 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581248 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581308 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581335 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581357 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.581841 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.584010 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.585887 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.589276 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.591119 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.596416 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.615475 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2df6\" (UniqueName: \"kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6\") pod \"ceilometer-0\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " pod="openstack/ceilometer-0" Feb 01 07:09:24 crc kubenswrapper[5127]: I0201 07:09:24.728794 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.241949 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.362302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerStarted","Data":"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f"} Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.362659 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.362680 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerStarted","Data":"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e"} Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.364003 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerStarted","Data":"8e9c9aa97c763483ffc2ec47571bc968faf3f2bdd45807b97205ecb9ce7e2d5d"} Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.366103 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerStarted","Data":"abd82f6068c6b870949dbb4a1eb5e0e72b3bf789bd0927a57568701cf1376c0d"} Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.370976 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" event={"ID":"28fa3b9a-8a1d-4954-89eb-6bf203c729d2","Type":"ContainerStarted","Data":"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa"} Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.371255 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.385761 5127 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.385744973 podStartE2EDuration="3.385744973s" podCreationTimestamp="2026-02-01 07:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:25.383521093 +0000 UTC m=+1315.869423456" watchObservedRunningTime="2026-02-01 07:09:25.385744973 +0000 UTC m=+1315.871647336" Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.411697 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" podStartSLOduration=3.411680214 podStartE2EDuration="3.411680214s" podCreationTimestamp="2026-02-01 07:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:25.403886243 +0000 UTC m=+1315.889788606" watchObservedRunningTime="2026-02-01 07:09:25.411680214 +0000 UTC m=+1315.897582567" Feb 01 07:09:25 crc kubenswrapper[5127]: I0201 07:09:25.717383 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:09:26 crc kubenswrapper[5127]: I0201 07:09:26.245750 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de51194a-4317-47c7-a5a8-cb81905825f2" path="/var/lib/kubelet/pods/de51194a-4317-47c7-a5a8-cb81905825f2/volumes" Feb 01 07:09:26 crc kubenswrapper[5127]: I0201 07:09:26.380564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerStarted","Data":"057a4d574e0ec1b0de1199d96913ec4dc9c37bd1fdd9c99e4893cf74f9fe7fa8"} Feb 01 07:09:26 crc kubenswrapper[5127]: I0201 07:09:26.382210 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerStarted","Data":"00623722cdf36847bbb66130f352751cee73adb4f1fe72b6f886f110dbebf240"} Feb 01 07:09:26 crc kubenswrapper[5127]: I0201 07:09:26.411282 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.5539429890000003 podStartE2EDuration="4.411255942s" podCreationTimestamp="2026-02-01 07:09:22 +0000 UTC" firstStartedPulling="2026-02-01 07:09:23.528741779 +0000 UTC m=+1314.014644142" lastFinishedPulling="2026-02-01 07:09:24.386054732 +0000 UTC m=+1314.871957095" observedRunningTime="2026-02-01 07:09:26.402017003 +0000 UTC m=+1316.887919366" watchObservedRunningTime="2026-02-01 07:09:26.411255942 +0000 UTC m=+1316.897158305" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.019968 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d87964d8-rmv9v" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:50914->10.217.0.163:9311: read: connection reset by peer" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.020011 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d87964d8-rmv9v" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:50898->10.217.0.163:9311: read: connection reset by peer" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.392786 5127 
generic.go:334] "Generic (PLEG): container finished" podID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerID="0ccf2b7841327abd2475fd89b38cce540d58bccf850d4bc3cdc5becd6eb10e22" exitCode=0 Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.392887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerDied","Data":"0ccf2b7841327abd2475fd89b38cce540d58bccf850d4bc3cdc5becd6eb10e22"} Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.400071 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerStarted","Data":"273a06a7fd0d012d844bcca367a3156b15003fceeeae2aa571bee83db57e01ce"} Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.400429 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api-log" containerID="cri-o://0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" gracePeriod=30 Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.400989 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api" containerID="cri-o://435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" gracePeriod=30 Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.479099 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.554898 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data\") pod \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.555025 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj6ts\" (UniqueName: \"kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts\") pod \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.555095 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom\") pod \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.555169 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs\") pod \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.555226 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle\") pod \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\" (UID: \"a2063c29-7f15-4f1d-a669-c3d2a303bc57\") " Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.557086 5127 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs" (OuterVolumeSpecName: "logs") pod "a2063c29-7f15-4f1d-a669-c3d2a303bc57" (UID: "a2063c29-7f15-4f1d-a669-c3d2a303bc57"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.570853 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts" (OuterVolumeSpecName: "kube-api-access-bj6ts") pod "a2063c29-7f15-4f1d-a669-c3d2a303bc57" (UID: "a2063c29-7f15-4f1d-a669-c3d2a303bc57"). InnerVolumeSpecName "kube-api-access-bj6ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.586337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a2063c29-7f15-4f1d-a669-c3d2a303bc57" (UID: "a2063c29-7f15-4f1d-a669-c3d2a303bc57"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.642982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2063c29-7f15-4f1d-a669-c3d2a303bc57" (UID: "a2063c29-7f15-4f1d-a669-c3d2a303bc57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.656969 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2063c29-7f15-4f1d-a669-c3d2a303bc57-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.657022 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.657034 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj6ts\" (UniqueName: \"kubernetes.io/projected/a2063c29-7f15-4f1d-a669-c3d2a303bc57-kube-api-access-bj6ts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.657044 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.680867 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data" (OuterVolumeSpecName: "config-data") pod "a2063c29-7f15-4f1d-a669-c3d2a303bc57" (UID: "a2063c29-7f15-4f1d-a669-c3d2a303bc57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:27 crc kubenswrapper[5127]: I0201 07:09:27.758093 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2063c29-7f15-4f1d-a669-c3d2a303bc57-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.027799 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.276266 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369118 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369203 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369313 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hssf\" (UniqueName: \"kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369339 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369379 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369414 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369453 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle\") pod \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\" (UID: \"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f\") " Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.369926 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.370298 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs" (OuterVolumeSpecName: "logs") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.374570 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.375455 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf" (OuterVolumeSpecName: "kube-api-access-6hssf") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "kube-api-access-6hssf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.377638 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts" (OuterVolumeSpecName: "scripts") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.400696 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.414096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d87964d8-rmv9v" event={"ID":"a2063c29-7f15-4f1d-a669-c3d2a303bc57","Type":"ContainerDied","Data":"4e6a99faf8229c47fafa147a15596fcc59573ee65e2838ff527c7f0b7195093b"} Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.414369 5127 scope.go:117] "RemoveContainer" containerID="0ccf2b7841327abd2475fd89b38cce540d58bccf850d4bc3cdc5becd6eb10e22" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.414338 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56d87964d8-rmv9v" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.420397 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerStarted","Data":"3ff311aebfb00ef708f5a11381da08386c244e4de1fc53799498ee06df024461"} Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.422948 5127 generic.go:334] "Generic (PLEG): container finished" podID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerID="435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" exitCode=0 Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.423169 5127 generic.go:334] "Generic (PLEG): container finished" podID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerID="0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" exitCode=143 Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.423123 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.423150 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerDied","Data":"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f"} Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.424289 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerDied","Data":"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e"} Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.424386 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f","Type":"ContainerDied","Data":"051bef1682bbe3a86a72137d0dd245acfc06d96b49f38c6f33136027c62f85a7"} Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.435103 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data" (OuterVolumeSpecName: "config-data") pod "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" (UID: "b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.443369 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"] Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.445828 5127 scope.go:117] "RemoveContainer" containerID="2595d723318172e4cd538176cf76ad6898d913735e3ef93149f84fa24866fc73" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.451429 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56d87964d8-rmv9v"] Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.462197 5127 scope.go:117] "RemoveContainer" containerID="435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472170 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hssf\" (UniqueName: \"kubernetes.io/projected/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-kube-api-access-6hssf\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472195 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472203 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472211 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472219 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472228 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.472236 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.480221 5127 scope.go:117] "RemoveContainer" containerID="0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.496125 5127 scope.go:117] "RemoveContainer" containerID="435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.496602 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f\": container with ID starting with 435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f not found: ID does not exist" containerID="435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.496640 5127 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f"} err="failed to get container status \"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f\": rpc error: code = NotFound desc = could not find container \"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f\": container with ID starting with 435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f not found: ID does not exist" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.496674 5127 scope.go:117] "RemoveContainer" containerID="0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.496977 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e\": container with ID starting with 0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e not found: ID does not exist" containerID="0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.497010 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e"} err="failed to get container status \"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e\": rpc error: code = NotFound desc = could not find container \"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e\": container with ID starting with 0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e not found: ID does not exist" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.497031 5127 scope.go:117] "RemoveContainer" containerID="435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.497296 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f"} err="failed to get container status \"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f\": rpc error: code = NotFound desc = could not find container \"435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f\": container with ID starting with 435e7fcdd35371ceac58f5e1179adb56afcefef09ac3a3728f0bae86d1ace73f not found: ID does not exist" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.497322 5127 scope.go:117] "RemoveContainer" containerID="0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.497563 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e"} err="failed to get container status \"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e\": rpc error: code = NotFound desc = could not find container \"0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e\": container with ID starting with 0b09b46b8708370001703199f04bae5cbd184c7f908b511a3f358f6dd3efb97e not found: ID does not exist" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.752641 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.759748 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 01 
07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.783241 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.783799 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.783867 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.783960 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.784015 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api" Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.784086 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.784138 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: E0201 07:09:28.784381 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.784431 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.786261 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.786400 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" containerName="cinder-api" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.786466 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.786523 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" containerName="barbican-api-log" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.787639 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.790283 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.790680 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.791789 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796433 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796509 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796546 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796573 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796618 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796666 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796702 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.796724 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc 
kubenswrapper[5127]: I0201 07:09:28.796745 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48fw\" (UniqueName: \"kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.799132 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.898684 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.898758 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.898790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48fw\" (UniqueName: \"kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900627 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900767 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900828 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900862 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900902 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.900987 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.901362 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.901760 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.902429 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.903894 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.909120 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.911657 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.913041 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.914884 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:28 crc kubenswrapper[5127]: I0201 07:09:28.919486 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48fw\" (UniqueName: \"kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw\") pod \"cinder-api-0\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") " pod="openstack/cinder-api-0" Feb 01 07:09:29 crc kubenswrapper[5127]: I0201 07:09:29.105921 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Feb 01 07:09:29 crc kubenswrapper[5127]: I0201 07:09:29.618631 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.247794 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2063c29-7f15-4f1d-a669-c3d2a303bc57" path="/var/lib/kubelet/pods/a2063c29-7f15-4f1d-a669-c3d2a303bc57/volumes"
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.248992 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f" path="/var/lib/kubelet/pods/b9a0d3ec-bfa9-4870-9fa5-0e8fbe32596f/volumes"
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.442832 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerStarted","Data":"3ef4cc3b13aaf195c0e9ab17d2d878bd41f2a1d4e67c807b8411510d47ddce71"}
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.443197 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerStarted","Data":"58b78ac0e023f1b9d6bbc86bfb115def8590217eba11dbc9a8f50c2c2e5076d3"}
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.447764 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerStarted","Data":"680b2930f2ceda06409667603beed405d980461a4b3b192a85c6b1a85483f665"}
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.447934 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 01 07:09:30 crc kubenswrapper[5127]: I0201 07:09:30.491816 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.172237361 podStartE2EDuration="6.491782937s" podCreationTimestamp="2026-02-01 07:09:24 +0000 UTC" firstStartedPulling="2026-02-01 07:09:25.253312393 +0000 UTC m=+1315.739214756" lastFinishedPulling="2026-02-01 07:09:29.572857959 +0000 UTC m=+1320.058760332" observedRunningTime="2026-02-01 07:09:30.482856616 +0000 UTC m=+1320.968758979" watchObservedRunningTime="2026-02-01 07:09:30.491782937 +0000 UTC m=+1320.977685300"
Feb 01 07:09:31 crc kubenswrapper[5127]: I0201 07:09:31.460000 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerStarted","Data":"c118e21ba6efeaa3a7ba640aedf062451fb2a67b8769dd72709cae85ff970c12"}
Feb 01 07:09:31 crc kubenswrapper[5127]: I0201 07:09:31.499944 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.499921717 podStartE2EDuration="3.499921717s" podCreationTimestamp="2026-02-01 07:09:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:31.484544682 +0000 UTC m=+1321.970447065" watchObservedRunningTime="2026-02-01 07:09:31.499921717 +0000 UTC m=+1321.985824120"
Feb 01 07:09:31 crc kubenswrapper[5127]: I0201 07:09:31.738946 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-769f857fd8-mc6lf" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.153:9696/\": dial tcp 10.217.0.153:9696: connect: connection refused"
Feb 01 07:09:32 crc kubenswrapper[5127]: I0201 07:09:32.476008 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.096984 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9"
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.167091 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"]
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.167361 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="dnsmasq-dns" containerID="cri-o://e5903e7f1235527332e8b7dd44c14a8ea5204e3d249efe266568d10d0397c7f2" gracePeriod=10
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.249599 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.289458 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.492189 5127 generic.go:334] "Generic (PLEG): container finished" podID="8baadd78-4c6f-4299-bb05-588666f19720" containerID="e5903e7f1235527332e8b7dd44c14a8ea5204e3d249efe266568d10d0397c7f2" exitCode=0
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.492612 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" event={"ID":"8baadd78-4c6f-4299-bb05-588666f19720","Type":"ContainerDied","Data":"e5903e7f1235527332e8b7dd44c14a8ea5204e3d249efe266568d10d0397c7f2"}
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.492761 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="cinder-scheduler" containerID="cri-o://abd82f6068c6b870949dbb4a1eb5e0e72b3bf789bd0927a57568701cf1376c0d" gracePeriod=30
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.492896 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="probe" containerID="cri-o://00623722cdf36847bbb66130f352751cee73adb4f1fe72b6f886f110dbebf240" gracePeriod=30
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.626015 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd"
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829490 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4662\" (UniqueName: \"kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829535 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829644 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829687 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829742 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.829771 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb\") pod \"8baadd78-4c6f-4299-bb05-588666f19720\" (UID: \"8baadd78-4c6f-4299-bb05-588666f19720\") "
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.841820 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662" (OuterVolumeSpecName: "kube-api-access-h4662") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "kube-api-access-h4662". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.881946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.887474 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.887837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config" (OuterVolumeSpecName: "config") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.888562 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.906108 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8baadd78-4c6f-4299-bb05-588666f19720" (UID: "8baadd78-4c6f-4299-bb05-588666f19720"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932236 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932281 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932293 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4662\" (UniqueName: \"kubernetes.io/projected/8baadd78-4c6f-4299-bb05-588666f19720-kube-api-access-h4662\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932309 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932320 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:33 crc kubenswrapper[5127]: I0201 07:09:33.932331 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8baadd78-4c6f-4299-bb05-588666f19720-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.443931 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c76548565-62sx9"
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.534136 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd"
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.534925 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-jvrtd" event={"ID":"8baadd78-4c6f-4299-bb05-588666f19720","Type":"ContainerDied","Data":"f4b6f5b09fbc02ebeff21c32e8b0d72f0cfe6cb6b99ca7958d0660ae2c5f66c1"}
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.534956 5127 scope.go:117] "RemoveContainer" containerID="e5903e7f1235527332e8b7dd44c14a8ea5204e3d249efe266568d10d0397c7f2"
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.543441 5127 generic.go:334] "Generic (PLEG): container finished" podID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerID="00623722cdf36847bbb66130f352751cee73adb4f1fe72b6f886f110dbebf240" exitCode=0
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.543500 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerDied","Data":"00623722cdf36847bbb66130f352751cee73adb4f1fe72b6f886f110dbebf240"}
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.569512 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"]
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.580463 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-jvrtd"]
Feb 01 07:09:34 crc kubenswrapper[5127]: I0201 07:09:34.591744 5127 scope.go:117] "RemoveContainer" containerID="f6b41e6f76c2670507f1f7149418ea3325709816d5e417716494a1687ad34313"
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.253698 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8baadd78-4c6f-4299-bb05-588666f19720" path="/var/lib/kubelet/pods/8baadd78-4c6f-4299-bb05-588666f19720/volumes"
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.588860 5127 generic.go:334] "Generic (PLEG): container finished" podID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerID="abd82f6068c6b870949dbb4a1eb5e0e72b3bf789bd0927a57568701cf1376c0d" exitCode=0
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.589151 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerDied","Data":"abd82f6068c6b870949dbb4a1eb5e0e72b3bf789bd0927a57568701cf1376c0d"}
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.715151 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7bcb954fdc-q646r"
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.802548 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c76548565-62sx9"]
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.802950 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c76548565-62sx9" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-api" containerID="cri-o://10b465e660e4f2883032c090226b977689b04511bf7c1d7ab8b7d44ff5bb1e77" gracePeriod=30
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.803117 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c76548565-62sx9" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-httpd" containerID="cri-o://f754d2dd9af31fff605fb93bb5e24240954c7191bee15e280e72a4fafc44bfb2" gracePeriod=30
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.940907 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988200 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fv85\" (UniqueName: \"kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988222 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988256 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988308 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988330 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id\") pod \"474c7fb9-fcd9-48aa-9287-d114245f9a63\" (UID: \"474c7fb9-fcd9-48aa-9287-d114245f9a63\") "
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.988673 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:09:36 crc kubenswrapper[5127]: I0201 07:09:36.996136 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85" (OuterVolumeSpecName: "kube-api-access-7fv85") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "kube-api-access-7fv85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.003710 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts" (OuterVolumeSpecName: "scripts") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.004746 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.073492 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.092497 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.092683 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/474c7fb9-fcd9-48aa-9287-d114245f9a63-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.092695 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fv85\" (UniqueName: \"kubernetes.io/projected/474c7fb9-fcd9-48aa-9287-d114245f9a63-kube-api-access-7fv85\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.092740 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.092753 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.112827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data" (OuterVolumeSpecName: "config-data") pod "474c7fb9-fcd9-48aa-9287-d114245f9a63" (UID: "474c7fb9-fcd9-48aa-9287-d114245f9a63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.194161 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/474c7fb9-fcd9-48aa-9287-d114245f9a63-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.631896 5127 generic.go:334] "Generic (PLEG): container finished" podID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerID="f754d2dd9af31fff605fb93bb5e24240954c7191bee15e280e72a4fafc44bfb2" exitCode=0
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.632011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerDied","Data":"f754d2dd9af31fff605fb93bb5e24240954c7191bee15e280e72a4fafc44bfb2"}
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.638116 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"474c7fb9-fcd9-48aa-9287-d114245f9a63","Type":"ContainerDied","Data":"9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312"}
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.638218 5127 scope.go:117] "RemoveContainer" containerID="00623722cdf36847bbb66130f352751cee73adb4f1fe72b6f886f110dbebf240"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.638496 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.674447 5127 scope.go:117] "RemoveContainer" containerID="abd82f6068c6b870949dbb4a1eb5e0e72b3bf789bd0927a57568701cf1376c0d"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.697779 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.713037 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747070 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:37 crc kubenswrapper[5127]: E0201 07:09:37.747449 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="probe"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747466 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="probe"
Feb 01 07:09:37 crc kubenswrapper[5127]: E0201 07:09:37.747493 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="dnsmasq-dns"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747501 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="dnsmasq-dns"
Feb 01 07:09:37 crc kubenswrapper[5127]: E0201 07:09:37.747510 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="cinder-scheduler"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747516 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="cinder-scheduler"
Feb 01 07:09:37 crc kubenswrapper[5127]: E0201 07:09:37.747527 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="init"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747533 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="init"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747739 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baadd78-4c6f-4299-bb05-588666f19720" containerName="dnsmasq-dns"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747754 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="probe"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.747770 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" containerName="cinder-scheduler"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.748648 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.752275 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.761048 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.881223 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5fdd8b75cb-lhmbf"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.904947 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.905001 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck44x\" (UniqueName: \"kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.905373 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.905600 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.905645 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:37 crc kubenswrapper[5127]: I0201 07:09:37.905800 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007155 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007208 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck44x\" (UniqueName: \"kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007357 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007383 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007403 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.007658 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.013179 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.015301 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.026635 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.028248 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.036307 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck44x\" (UniqueName: \"kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x\") pod \"cinder-scheduler-0\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.068941 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.250779 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474c7fb9-fcd9-48aa-9287-d114245f9a63" path="/var/lib/kubelet/pods/474c7fb9-fcd9-48aa-9287-d114245f9a63/volumes"
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.539548 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 01 07:09:38 crc kubenswrapper[5127]: I0201 07:09:38.649210 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerStarted","Data":"2398a517b2bc7bf270cfec8577c2094869842f4a692025d3af41f561ab33fc3e"}
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.363271 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-769f857fd8-mc6lf_e87ee524-fbce-45ca-b3fb-e6b59a739f73/neutron-api/0.log"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.363786 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.557243 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle\") pod \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") "
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.557312 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config\") pod \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") "
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.557359 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs\") pod \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") "
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.557419 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config\") pod \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") "
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.557445 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zck9\" (UniqueName: \"kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9\") pod \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\" (UID: \"e87ee524-fbce-45ca-b3fb-e6b59a739f73\") "
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.572721 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e87ee524-fbce-45ca-b3fb-e6b59a739f73" (UID: "e87ee524-fbce-45ca-b3fb-e6b59a739f73"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.575747 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9" (OuterVolumeSpecName: "kube-api-access-9zck9") pod "e87ee524-fbce-45ca-b3fb-e6b59a739f73" (UID: "e87ee524-fbce-45ca-b3fb-e6b59a739f73"). InnerVolumeSpecName "kube-api-access-9zck9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.620694 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config" (OuterVolumeSpecName: "config") pod "e87ee524-fbce-45ca-b3fb-e6b59a739f73" (UID: "e87ee524-fbce-45ca-b3fb-e6b59a739f73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.661307 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.661511 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.661529 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zck9\" (UniqueName: \"kubernetes.io/projected/e87ee524-fbce-45ca-b3fb-e6b59a739f73-kube-api-access-9zck9\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.662158 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e87ee524-fbce-45ca-b3fb-e6b59a739f73" (UID: "e87ee524-fbce-45ca-b3fb-e6b59a739f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.681471 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerStarted","Data":"45488eefbe618c6ed70968bb3a79848f397c02da3176113bc9124b98acb538e2"}
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.683550 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e87ee524-fbce-45ca-b3fb-e6b59a739f73" (UID: "e87ee524-fbce-45ca-b3fb-e6b59a739f73"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684308 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-769f857fd8-mc6lf_e87ee524-fbce-45ca-b3fb-e6b59a739f73/neutron-api/0.log"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684376 5127 generic.go:334] "Generic (PLEG): container finished" podID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerID="c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4" exitCode=137
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684429 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerDied","Data":"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"}
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684453 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-769f857fd8-mc6lf" event={"ID":"e87ee524-fbce-45ca-b3fb-e6b59a739f73","Type":"ContainerDied","Data":"56f6ff2e1bf999e0c3761e33c152589bf39514fef880cd4270f186f01001aa6b"}
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684471 5127 scope.go:117] "RemoveContainer" containerID="dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.684669 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-769f857fd8-mc6lf"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.763055 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.763080 5127 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e87ee524-fbce-45ca-b3fb-e6b59a739f73-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.790819 5127 scope.go:117] "RemoveContainer" containerID="c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.810063 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"]
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.819687 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-769f857fd8-mc6lf"]
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.821429 5127 scope.go:117] "RemoveContainer" containerID="dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"
Feb 01 07:09:39 crc kubenswrapper[5127]: E0201 07:09:39.823755 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa\": container with ID starting with dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa not found: ID does not exist" containerID="dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.823864 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa"} err="failed to get container status \"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa\": rpc error: code = NotFound desc = could not find container \"dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa\": container with ID starting with dd4174a02e18e3699d8814311fd52f2d9720c4787ece717df0ef4952501ce9aa not found: ID does not exist"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.823895 5127 scope.go:117] "RemoveContainer" containerID="c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"
Feb 01 07:09:39 crc kubenswrapper[5127]: E0201 07:09:39.824157 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4\": container with ID starting with c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4 not found: ID does not exist" containerID="c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.824184 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4"} err="failed to get container status \"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4\": rpc error: code = NotFound desc = could not find container \"c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4\": container with ID starting with c1aacde2ecdf1f90f319019b1f5642f04207f1232f3b6793b65a7cd2852dcbd4 not found: ID does not exist"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.840187 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 01 07:09:39 crc kubenswrapper[5127]: E0201 07:09:39.840527 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-api"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.840538 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-api"
Feb 01 07:09:39 crc kubenswrapper[5127]: E0201 07:09:39.840557 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-httpd"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.840563 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-httpd"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.840745 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-httpd"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.840758 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" containerName="neutron-api"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.841283 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.848212 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.848379 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-dxft5"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.848497 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.854791 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.865162 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.865236 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lmx\" (UniqueName: \"kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.865291 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.865313 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.969752 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.969809 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.969915 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.969974 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26lmx\" (UniqueName: \"kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.972019 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.975112 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:39 crc kubenswrapper[5127]: I0201 07:09:39.976676 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.006356 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26lmx\" (UniqueName: \"kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx\") pod \"openstackclient\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " pod="openstack/openstackclient"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.166151 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.252309 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87ee524-fbce-45ca-b3fb-e6b59a739f73" path="/var/lib/kubelet/pods/e87ee524-fbce-45ca-b3fb-e6b59a739f73/volumes"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.370621 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fbd756774-8bz24"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.372249 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6fbd756774-8bz24"
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.448478 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-785b58c67b-rrzfw"]
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.454958 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-785b58c67b-rrzfw" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-log" containerID="cri-o://d9d494912fd4bed9d4a8e96ace41f4839e6099c8b86355e161b702c93bc5920a" gracePeriod=30
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.455349 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-785b58c67b-rrzfw" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-api" containerID="cri-o://59caa9defd7b25237630d39d27038f3e0a8a5e123e7f87d19dcdcc603c61f215" gracePeriod=30
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.669850 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.712515 5127 generic.go:334] "Generic (PLEG): container finished" podID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerID="d9d494912fd4bed9d4a8e96ace41f4839e6099c8b86355e161b702c93bc5920a" exitCode=143
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.712772 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerDied","Data":"d9d494912fd4bed9d4a8e96ace41f4839e6099c8b86355e161b702c93bc5920a"}
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.714190 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerStarted","Data":"0be8d4cb9574063f87962b5663f7c99862b6167cbe906b2f8987098ff021beff"}
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.721795 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a15e38c1-f8c8-4e6c-9e52-1b39e952017d","Type":"ContainerStarted","Data":"73a78e4e027006c2beb27f378db1b1c7ff4cf5ddefe6df32d6f3af9f9f0064c0"}
Feb 01 07:09:40 crc kubenswrapper[5127]: I0201 07:09:40.732355 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.732338786 podStartE2EDuration="3.732338786s" podCreationTimestamp="2026-02-01 07:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:40.731820102 +0000 UTC m=+1331.217722475" watchObservedRunningTime="2026-02-01 07:09:40.732338786 +0000 UTC m=+1331.218241139"
Feb 01 07:09:41 crc kubenswrapper[5127]: I0201 07:09:41.985645 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.069650 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.751097 5127 generic.go:334] "Generic (PLEG): container finished" podID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerID="59caa9defd7b25237630d39d27038f3e0a8a5e123e7f87d19dcdcc603c61f215" exitCode=0
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.751136 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerDied","Data":"59caa9defd7b25237630d39d27038f3e0a8a5e123e7f87d19dcdcc603c61f215"}
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.869873 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"]
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.874780 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.878952 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.879355 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.879488 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.886683 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"]
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945422 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945489 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945551 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945644 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945705 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8w8\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945746 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.945790 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:43 crc kubenswrapper[5127]: I0201 07:09:43.946015 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.043779 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-785b58c67b-rrzfw"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046547 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzhpv\" (UniqueName: \"kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046634 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046712 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046736 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046809 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle\") pod \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\" (UID: \"0da4cb95-6224-41e2-9adc-4d0d56a0c162\") "
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.046967 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047042 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8w8\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047135 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047186 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047251 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.047317 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.056160 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.056408 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs" (OuterVolumeSpecName: "logs") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.056775 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts" (OuterVolumeSpecName: "scripts") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.060153 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf"
Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.062174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv" (OuterVolumeSpecName: "kube-api-access-gzhpv") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "kube-api-access-gzhpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.064929 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.066218 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.073171 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.078554 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.093100 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8w8\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.101757 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift\") pod \"swift-proxy-7f7d4bc459-g6tgf\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.153853 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0da4cb95-6224-41e2-9adc-4d0d56a0c162-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.153883 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.153892 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzhpv\" (UniqueName: \"kubernetes.io/projected/0da4cb95-6224-41e2-9adc-4d0d56a0c162-kube-api-access-gzhpv\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.156509 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.187768 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data" (OuterVolumeSpecName: "config-data") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.210121 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.260345 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.260373 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.302747 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.325777 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0da4cb95-6224-41e2-9adc-4d0d56a0c162" (UID: "0da4cb95-6224-41e2-9adc-4d0d56a0c162"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.365056 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.365329 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0da4cb95-6224-41e2-9adc-4d0d56a0c162-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.768676 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.769151 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-central-agent" containerID="cri-o://057a4d574e0ec1b0de1199d96913ec4dc9c37bd1fdd9c99e4893cf74f9fe7fa8" gracePeriod=30 Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.769282 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-notification-agent" containerID="cri-o://273a06a7fd0d012d844bcca367a3156b15003fceeeae2aa571bee83db57e01ce" gracePeriod=30 Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.769414 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="sg-core" containerID="cri-o://3ff311aebfb00ef708f5a11381da08386c244e4de1fc53799498ee06df024461" gracePeriod=30 Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.769525 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="proxy-httpd" containerID="cri-o://680b2930f2ceda06409667603beed405d980461a4b3b192a85c6b1a85483f665" gracePeriod=30 Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.772815 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-785b58c67b-rrzfw" event={"ID":"0da4cb95-6224-41e2-9adc-4d0d56a0c162","Type":"ContainerDied","Data":"9c39754222c5e421fee197611eeee3c1dfecc6368f341ae2cba990656bd7fdea"} Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.772861 5127 scope.go:117] "RemoveContainer" containerID="59caa9defd7b25237630d39d27038f3e0a8a5e123e7f87d19dcdcc603c61f215" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.773020 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-785b58c67b-rrzfw" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.780717 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": EOF" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.810700 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-785b58c67b-rrzfw"] Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.822964 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-785b58c67b-rrzfw"] Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.849658 5127 scope.go:117] "RemoveContainer" containerID="d9d494912fd4bed9d4a8e96ace41f4839e6099c8b86355e161b702c93bc5920a" Feb 01 07:09:44 crc kubenswrapper[5127]: I0201 07:09:44.937624 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"] Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.787113 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerStarted","Data":"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.787417 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerStarted","Data":"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.787432 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerStarted","Data":"a6fb044fc6a7651321acf6a9de2d7857d5c8415758dd2c063b3396dab494fcc5"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.788289 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.788303 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792336 5127 generic.go:334] "Generic (PLEG): container finished" podID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerID="680b2930f2ceda06409667603beed405d980461a4b3b192a85c6b1a85483f665" exitCode=0 Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792361 5127 generic.go:334] "Generic (PLEG): container finished" podID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerID="3ff311aebfb00ef708f5a11381da08386c244e4de1fc53799498ee06df024461" exitCode=2 Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792370 5127 generic.go:334] "Generic (PLEG): container finished" podID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerID="057a4d574e0ec1b0de1199d96913ec4dc9c37bd1fdd9c99e4893cf74f9fe7fa8" exitCode=0 Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792389 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerDied","Data":"680b2930f2ceda06409667603beed405d980461a4b3b192a85c6b1a85483f665"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerDied","Data":"3ff311aebfb00ef708f5a11381da08386c244e4de1fc53799498ee06df024461"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.792424 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerDied","Data":"057a4d574e0ec1b0de1199d96913ec4dc9c37bd1fdd9c99e4893cf74f9fe7fa8"} Feb 01 07:09:45 crc kubenswrapper[5127]: I0201 07:09:45.812465 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" podStartSLOduration=2.8124450789999997 podStartE2EDuration="2.812445079s" podCreationTimestamp="2026-02-01 07:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:45.804385981 +0000 UTC m=+1336.290288344" watchObservedRunningTime="2026-02-01 07:09:45.812445079 +0000 UTC m=+1336.298347432" Feb 01 07:09:46 crc kubenswrapper[5127]: I0201 07:09:46.249407 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" path="/var/lib/kubelet/pods/0da4cb95-6224-41e2-9adc-4d0d56a0c162/volumes" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.609998 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c8jq7"] Feb 01 07:09:47 crc kubenswrapper[5127]: E0201 07:09:47.610640 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-log" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.610654 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-log" Feb 01 07:09:47 crc kubenswrapper[5127]: E0201 07:09:47.610682 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-api" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.610688 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-api" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.610841 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-log" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.610862 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0da4cb95-6224-41e2-9adc-4d0d56a0c162" containerName="placement-api" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.611425 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.632660 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c8jq7"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.708471 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-sflr2"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.713561 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.735280 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.735335 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6mh5\" (UniqueName: \"kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.737934 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bb8a-account-create-update-l8j7b"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.739389 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.749972 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.767069 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sflr2"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.805750 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-l8j7b"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.836873 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.836923 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6mh5\" (UniqueName: \"kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.836969 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.837010 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.837059 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2x4tk\" (UniqueName: \"kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.837104 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmks\" (UniqueName: \"kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.837872 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.839198 5127 generic.go:334] "Generic (PLEG): container finished" podID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerID="10b465e660e4f2883032c090226b977689b04511bf7c1d7ab8b7d44ff5bb1e77" exitCode=0 Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.839229 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerDied","Data":"10b465e660e4f2883032c090226b977689b04511bf7c1d7ab8b7d44ff5bb1e77"} Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.840707 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nn8d9"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.842142 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.865207 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6mh5\" (UniqueName: \"kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5\") pod \"nova-api-db-create-c8jq7\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.865394 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn8d9"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.911946 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4519-account-create-update-w7f6k"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.914454 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.917879 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.921122 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-w7f6k"] Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.927943 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.938748 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmks\" (UniqueName: \"kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.939008 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdsrn\" (UniqueName: \"kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.939089 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.939161 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.939197 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.939262 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x4tk\" (UniqueName: \"kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.940359 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.940885 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.962286 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x4tk\" (UniqueName: 
\"kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk\") pod \"nova-cell0-db-create-sflr2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:47 crc kubenswrapper[5127]: I0201 07:09:47.962336 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmks\" (UniqueName: \"kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks\") pod \"nova-api-bb8a-account-create-update-l8j7b\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.040710 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jd4g\" (UniqueName: \"kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.040997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.041076 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdsrn\" (UniqueName: \"kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.041323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.042188 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.051573 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.056278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdsrn\" (UniqueName: \"kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn\") pod \"nova-cell1-db-create-nn8d9\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.099899 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.110303 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9eab-account-create-update-f4jqw"] Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.115502 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.133044 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.143903 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.144060 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jd4g\" (UniqueName: \"kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.147156 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.152275 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9eab-account-create-update-f4jqw"] Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.185142 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.191849 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jd4g\" (UniqueName: \"kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g\") pod \"nova-cell0-4519-account-create-update-w7f6k\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.243099 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.248678 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.248744 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6kt\" (UniqueName: \"kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.351186 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.351256 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6kt\" (UniqueName: \"kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.352004 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.365916 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.370873 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6kt\" (UniqueName: \"kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt\") pod \"nova-cell1-9eab-account-create-update-f4jqw\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.461600 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.857418 5127 generic.go:334] "Generic (PLEG): container finished" podID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerID="273a06a7fd0d012d844bcca367a3156b15003fceeeae2aa571bee83db57e01ce" exitCode=0 Feb 01 07:09:48 crc kubenswrapper[5127]: I0201 07:09:48.857470 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerDied","Data":"273a06a7fd0d012d844bcca367a3156b15003fceeeae2aa571bee83db57e01ce"} Feb 01 07:09:49 crc kubenswrapper[5127]: E0201 07:09:49.077140 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312\": RecentStats: unable to find data in memory cache]" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.626498 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732237 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732619 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732687 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732720 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732755 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732802 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2df6\" (UniqueName: \"kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.732835 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts\") pod \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\" (UID: \"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.736164 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.736566 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.757501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6" (OuterVolumeSpecName: "kube-api-access-n2df6") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "kube-api-access-n2df6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.760513 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts" (OuterVolumeSpecName: "scripts") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.783361 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.819055 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c76548565-62sx9" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.834854 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.834883 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.834892 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.834902 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2df6\" (UniqueName: \"kubernetes.io/projected/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-kube-api-access-n2df6\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.834912 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.851379 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.893891 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data" (OuterVolumeSpecName: "config-data") pod "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" (UID: "d772ca91-f1a3-49e0-9fe2-7381a0ffcc51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.896491 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c76548565-62sx9" event={"ID":"5701395e-85bc-40a9-bff7-f1b452b8e187","Type":"ContainerDied","Data":"eef349efd9f8afc4c6c60ee85d4c4d3831137e5254d4a5aea09e472c2aed68f6"} Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.896540 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c76548565-62sx9" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.896559 5127 scope.go:117] "RemoveContainer" containerID="f754d2dd9af31fff605fb93bb5e24240954c7191bee15e280e72a4fafc44bfb2" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.900420 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d772ca91-f1a3-49e0-9fe2-7381a0ffcc51","Type":"ContainerDied","Data":"8e9c9aa97c763483ffc2ec47571bc968faf3f2bdd45807b97205ecb9ce7e2d5d"} Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.900505 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.905095 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a15e38c1-f8c8-4e6c-9e52-1b39e952017d","Type":"ContainerStarted","Data":"4acd4b5b4ff519a2d04a0bd77806acea282f43ce9562e95499145235dc585912"} Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.917061 5127 scope.go:117] "RemoveContainer" containerID="10b465e660e4f2883032c090226b977689b04511bf7c1d7ab8b7d44ff5bb1e77" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.932912 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.324583014 podStartE2EDuration="13.932892642s" podCreationTimestamp="2026-02-01 07:09:39 +0000 UTC" firstStartedPulling="2026-02-01 07:09:40.677876994 +0000 UTC m=+1331.163779347" lastFinishedPulling="2026-02-01 07:09:52.286186622 +0000 UTC m=+1342.772088975" observedRunningTime="2026-02-01 07:09:52.927134836 +0000 UTC m=+1343.413037199" watchObservedRunningTime="2026-02-01 07:09:52.932892642 +0000 UTC m=+1343.418795005" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936407 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936501 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936566 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936667 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936706 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936729 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs\") pod \"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.936749 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txrss\" (UniqueName: \"kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss\") pod 
\"5701395e-85bc-40a9-bff7-f1b452b8e187\" (UID: \"5701395e-85bc-40a9-bff7-f1b452b8e187\") " Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.937122 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.937135 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.959535 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss" (OuterVolumeSpecName: "kube-api-access-txrss") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "kube-api-access-txrss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.960382 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.977865 5127 scope.go:117] "RemoveContainer" containerID="680b2930f2ceda06409667603beed405d980461a4b3b192a85c6b1a85483f665" Feb 01 07:09:52 crc kubenswrapper[5127]: I0201 07:09:52.991019 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.010625 5127 scope.go:117] "RemoveContainer" containerID="3ff311aebfb00ef708f5a11381da08386c244e4de1fc53799498ee06df024461" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.031455 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.038539 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.038567 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txrss\" (UniqueName: \"kubernetes.io/projected/5701395e-85bc-40a9-bff7-f1b452b8e187-kube-api-access-txrss\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.040489 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config" (OuterVolumeSpecName: "config") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.050618 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051135 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="sg-core" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051155 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="sg-core" Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051180 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-notification-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051188 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-notification-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051202 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-api" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051211 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-api" Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051233 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="proxy-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051240 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="proxy-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051258 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051266 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: E0201 07:09:53.051278 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-central-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051285 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-central-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051496 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-central-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051513 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-api" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051532 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="proxy-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051553 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" containerName="neutron-httpd" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051562 5127 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="sg-core" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.051575 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" containerName="ceilometer-notification-agent" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.053659 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.056540 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.057654 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.064381 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.072142 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.087821 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.091869 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.103172 5127 scope.go:117] "RemoveContainer" containerID="273a06a7fd0d012d844bcca367a3156b15003fceeeae2aa571bee83db57e01ce" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.106875 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c8jq7"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.106878 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5701395e-85bc-40a9-bff7-f1b452b8e187" (UID: "5701395e-85bc-40a9-bff7-f1b452b8e187"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140733 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140820 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140856 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140915 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140958 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.140983 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjsnh\" (UniqueName: \"kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141002 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141062 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141074 5127 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141083 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141092 5127 reconciler_common.go:293] "Volume 
detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.141100 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5701395e-85bc-40a9-bff7-f1b452b8e187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.149893 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9eab-account-create-update-f4jqw"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.171370 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sflr2"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.192843 5127 scope.go:117] "RemoveContainer" containerID="057a4d574e0ec1b0de1199d96913ec4dc9c37bd1fdd9c99e4893cf74f9fe7fa8" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243261 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243287 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjsnh\" (UniqueName: \"kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243433 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243480 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.243513 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.244063 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.246389 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.250786 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.252555 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.254290 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.254444 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-w7f6k"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.257029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.268551 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjsnh\" (UniqueName: \"kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh\") pod \"ceilometer-0\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.381943 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.397765 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nn8d9"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.416511 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c76548565-62sx9"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.424768 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c76548565-62sx9"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.433089 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-l8j7b"] Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.932075 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" event={"ID":"657c0b79-3594-4a70-a7de-6152741e8148","Type":"ContainerStarted","Data":"61e718b5841a9da16e8cc4920a23aae60828b65b752e22e97dd3965261d101ac"} Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.932456 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" event={"ID":"657c0b79-3594-4a70-a7de-6152741e8148","Type":"ContainerStarted","Data":"a0436199ace60b9ec5b144b916c8aea1a30125b3324b1a46672b143b298947be"} Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.953135 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c8jq7" event={"ID":"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1","Type":"ContainerStarted","Data":"f98afb71f045a45fd856bbc1a5357077f7ba521a2f87d844ed202ca981f7c708"} Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.953193 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c8jq7" event={"ID":"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1","Type":"ContainerStarted","Data":"24b4a04fb3e8c55c12117001e1e106cac1ed88e6210c89b5804f616efd032a25"} Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.964844 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" podStartSLOduration=6.964823664 podStartE2EDuration="6.964823664s" podCreationTimestamp="2026-02-01 07:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:53.956799528 +0000 UTC m=+1344.442701891" watchObservedRunningTime="2026-02-01 07:09:53.964823664 +0000 UTC m=+1344.450726037" Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.986889 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn8d9" event={"ID":"254e67ea-20e8-4960-ae74-c4d1bff0369a","Type":"ContainerStarted","Data":"765723f0588cfac569cb1d2b34aaaf61f0d10551bffade767398b9e84a692d76"} Feb 01 07:09:53 crc kubenswrapper[5127]: I0201 07:09:53.986939 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn8d9" event={"ID":"254e67ea-20e8-4960-ae74-c4d1bff0369a","Type":"ContainerStarted","Data":"a47db05571f6d695eb9ddb92e74165997f15b61fafd17eef617c01e92aaa91da"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.022742 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.027755 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" 
event={"ID":"c6a4da05-deae-4395-a91b-b8ddfb804f8a","Type":"ContainerStarted","Data":"83c1670061335e3a90879e2fc206a129d30c726e80c1076a312eddd7f881625a"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.027812 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" event={"ID":"c6a4da05-deae-4395-a91b-b8ddfb804f8a","Type":"ContainerStarted","Data":"e98b35686a127e9d9df2313904d81dd9fb53d395135391a2267dd9565d3735a8"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.033392 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-c8jq7" podStartSLOduration=7.033369748 podStartE2EDuration="7.033369748s" podCreationTimestamp="2026-02-01 07:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:53.977041915 +0000 UTC m=+1344.462944278" watchObservedRunningTime="2026-02-01 07:09:54.033369748 +0000 UTC m=+1344.519272121" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.039011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" event={"ID":"c9af14ed-135f-45d2-9aca-55513eb0e860","Type":"ContainerStarted","Data":"1b5a0ec763bf580aef8e9f8e8d7be9087c396c7cc6f58d1fd6b7ab61ac9d9f28"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.039061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" event={"ID":"c9af14ed-135f-45d2-9aca-55513eb0e860","Type":"ContainerStarted","Data":"f2d6d238e63acd717ec345e19d07e9963e66293add822c8643e4cd73790d94af"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.050121 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sflr2" event={"ID":"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2","Type":"ContainerStarted","Data":"00489ed369f9be9fa5ae5086922bd8bea26af0611b892a20c80a5e1db18c8328"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.050171 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sflr2" event={"ID":"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2","Type":"ContainerStarted","Data":"0f548475089f39224d3f8e981cc2cd9baac946d3eed77368c043bbbe0e6f4c54"} Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.057089 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nn8d9" podStartSLOduration=7.057072018 podStartE2EDuration="7.057072018s" podCreationTimestamp="2026-02-01 07:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:54.013612044 +0000 UTC m=+1344.499514407" watchObservedRunningTime="2026-02-01 07:09:54.057072018 +0000 UTC m=+1344.542974381" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.068812 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" podStartSLOduration=6.068793706 podStartE2EDuration="6.068793706s" podCreationTimestamp="2026-02-01 07:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:54.054091677 +0000 UTC m=+1344.539994060" watchObservedRunningTime="2026-02-01 07:09:54.068793706 +0000 UTC m=+1344.554696069" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.082882 5127 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" podStartSLOduration=7.082859506 podStartE2EDuration="7.082859506s" podCreationTimestamp="2026-02-01 07:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:54.068513468 +0000 UTC m=+1344.554415841" watchObservedRunningTime="2026-02-01 07:09:54.082859506 +0000 UTC m=+1344.568761869" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.107028 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-sflr2" podStartSLOduration=7.107005608 podStartE2EDuration="7.107005608s" podCreationTimestamp="2026-02-01 07:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:09:54.098791356 +0000 UTC m=+1344.584693719" watchObservedRunningTime="2026-02-01 07:09:54.107005608 +0000 UTC m=+1344.592907971" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.221645 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.221719 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.259219 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5701395e-85bc-40a9-bff7-f1b452b8e187" path="/var/lib/kubelet/pods/5701395e-85bc-40a9-bff7-f1b452b8e187/volumes" Feb 01 07:09:54 crc kubenswrapper[5127]: I0201 07:09:54.260174 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d772ca91-f1a3-49e0-9fe2-7381a0ffcc51" path="/var/lib/kubelet/pods/d772ca91-f1a3-49e0-9fe2-7381a0ffcc51/volumes" Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.063833 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerStarted","Data":"d76f7a36806114208117c1ee1d0b06461621b41baf8155010c177a528a50ed1f"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.064545 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerStarted","Data":"5b8c34145583682d76595862846717720fac9940d474d5a5a04b75cf340b9e89"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.067367 5127 generic.go:334] "Generic (PLEG): container finished" podID="c6a4da05-deae-4395-a91b-b8ddfb804f8a" containerID="83c1670061335e3a90879e2fc206a129d30c726e80c1076a312eddd7f881625a" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.067461 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" event={"ID":"c6a4da05-deae-4395-a91b-b8ddfb804f8a","Type":"ContainerDied","Data":"83c1670061335e3a90879e2fc206a129d30c726e80c1076a312eddd7f881625a"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.070785 5127 generic.go:334] "Generic (PLEG): container finished" podID="c9af14ed-135f-45d2-9aca-55513eb0e860" containerID="1b5a0ec763bf580aef8e9f8e8d7be9087c396c7cc6f58d1fd6b7ab61ac9d9f28" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.070825 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-4519-account-create-update-w7f6k" event={"ID":"c9af14ed-135f-45d2-9aca-55513eb0e860","Type":"ContainerDied","Data":"1b5a0ec763bf580aef8e9f8e8d7be9087c396c7cc6f58d1fd6b7ab61ac9d9f28"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.073133 5127 generic.go:334] "Generic (PLEG): container finished" podID="579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" containerID="00489ed369f9be9fa5ae5086922bd8bea26af0611b892a20c80a5e1db18c8328" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.073184 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sflr2" event={"ID":"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2","Type":"ContainerDied","Data":"00489ed369f9be9fa5ae5086922bd8bea26af0611b892a20c80a5e1db18c8328"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.074895 5127 generic.go:334] "Generic (PLEG): container finished" podID="657c0b79-3594-4a70-a7de-6152741e8148" containerID="61e718b5841a9da16e8cc4920a23aae60828b65b752e22e97dd3965261d101ac" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.074935 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" event={"ID":"657c0b79-3594-4a70-a7de-6152741e8148","Type":"ContainerDied","Data":"61e718b5841a9da16e8cc4920a23aae60828b65b752e22e97dd3965261d101ac"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.076664 5127 generic.go:334] "Generic (PLEG): container finished" podID="c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" containerID="f98afb71f045a45fd856bbc1a5357077f7ba521a2f87d844ed202ca981f7c708" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.076727 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c8jq7" event={"ID":"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1","Type":"ContainerDied","Data":"f98afb71f045a45fd856bbc1a5357077f7ba521a2f87d844ed202ca981f7c708"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.077890 5127 generic.go:334] "Generic (PLEG): container finished" podID="254e67ea-20e8-4960-ae74-c4d1bff0369a" containerID="765723f0588cfac569cb1d2b34aaaf61f0d10551bffade767398b9e84a692d76" exitCode=0 Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.077919 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn8d9" event={"ID":"254e67ea-20e8-4960-ae74-c4d1bff0369a","Type":"ContainerDied","Data":"765723f0588cfac569cb1d2b34aaaf61f0d10551bffade767398b9e84a692d76"} Feb 01 07:09:55 crc kubenswrapper[5127]: I0201 07:09:55.768275 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.103608 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerStarted","Data":"4535a4dc300b87c6d2e095ff64a11512749c0d4e5f0194b79a07de3abd4555ea"} Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.652985 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.731456 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts\") pod \"c9af14ed-135f-45d2-9aca-55513eb0e860\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.731595 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jd4g\" (UniqueName: \"kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g\") pod \"c9af14ed-135f-45d2-9aca-55513eb0e860\" (UID: \"c9af14ed-135f-45d2-9aca-55513eb0e860\") " Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.732685 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c9af14ed-135f-45d2-9aca-55513eb0e860" (UID: "c9af14ed-135f-45d2-9aca-55513eb0e860"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.737810 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g" (OuterVolumeSpecName: "kube-api-access-4jd4g") pod "c9af14ed-135f-45d2-9aca-55513eb0e860" (UID: "c9af14ed-135f-45d2-9aca-55513eb0e860"). InnerVolumeSpecName "kube-api-access-4jd4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.833373 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c9af14ed-135f-45d2-9aca-55513eb0e860-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.833401 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jd4g\" (UniqueName: \"kubernetes.io/projected/c9af14ed-135f-45d2-9aca-55513eb0e860-kube-api-access-4jd4g\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.906297 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.910137 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.914216 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.922367 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:56 crc kubenswrapper[5127]: I0201 07:09:56.923182 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036161 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6kt\" (UniqueName: \"kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt\") pod \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036252 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts\") pod \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\" (UID: \"c6a4da05-deae-4395-a91b-b8ddfb804f8a\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036483 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts\") pod \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036507 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdsrn\" (UniqueName: \"kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn\") pod \"254e67ea-20e8-4960-ae74-c4d1bff0369a\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036589 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts\") pod \"657c0b79-3594-4a70-a7de-6152741e8148\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036620 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts\") pod \"254e67ea-20e8-4960-ae74-c4d1bff0369a\" (UID: \"254e67ea-20e8-4960-ae74-c4d1bff0369a\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036683 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6mh5\" (UniqueName: \"kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5\") pod \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\" (UID: \"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036711 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x4tk\" (UniqueName: \"kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk\") pod \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036733 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmks\" (UniqueName: \"kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks\") pod \"657c0b79-3594-4a70-a7de-6152741e8148\" (UID: \"657c0b79-3594-4a70-a7de-6152741e8148\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.036790 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts\") pod \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\" (UID: \"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2\") " Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.037520 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" (UID: "579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.038237 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "657c0b79-3594-4a70-a7de-6152741e8148" (UID: "657c0b79-3594-4a70-a7de-6152741e8148"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.038617 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6a4da05-deae-4395-a91b-b8ddfb804f8a" (UID: "c6a4da05-deae-4395-a91b-b8ddfb804f8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.039006 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" (UID: "c4c2b589-6308-42bc-8b1e-c2d4f3e210b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.042094 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "254e67ea-20e8-4960-ae74-c4d1bff0369a" (UID: "254e67ea-20e8-4960-ae74-c4d1bff0369a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.043081 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt" (OuterVolumeSpecName: "kube-api-access-zv6kt") pod "c6a4da05-deae-4395-a91b-b8ddfb804f8a" (UID: "c6a4da05-deae-4395-a91b-b8ddfb804f8a"). InnerVolumeSpecName "kube-api-access-zv6kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.051088 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5" (OuterVolumeSpecName: "kube-api-access-z6mh5") pod "c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" (UID: "c4c2b589-6308-42bc-8b1e-c2d4f3e210b1"). InnerVolumeSpecName "kube-api-access-z6mh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.057817 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn" (OuterVolumeSpecName: "kube-api-access-zdsrn") pod "254e67ea-20e8-4960-ae74-c4d1bff0369a" (UID: "254e67ea-20e8-4960-ae74-c4d1bff0369a"). InnerVolumeSpecName "kube-api-access-zdsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.057931 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk" (OuterVolumeSpecName: "kube-api-access-2x4tk") pod "579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" (UID: "579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2"). InnerVolumeSpecName "kube-api-access-2x4tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.058269 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks" (OuterVolumeSpecName: "kube-api-access-flmks") pod "657c0b79-3594-4a70-a7de-6152741e8148" (UID: "657c0b79-3594-4a70-a7de-6152741e8148"). InnerVolumeSpecName "kube-api-access-flmks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.131035 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" event={"ID":"c6a4da05-deae-4395-a91b-b8ddfb804f8a","Type":"ContainerDied","Data":"e98b35686a127e9d9df2313904d81dd9fb53d395135391a2267dd9565d3735a8"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.131074 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e98b35686a127e9d9df2313904d81dd9fb53d395135391a2267dd9565d3735a8" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.131130 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9eab-account-create-update-f4jqw" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141533 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv6kt\" (UniqueName: \"kubernetes.io/projected/c6a4da05-deae-4395-a91b-b8ddfb804f8a-kube-api-access-zv6kt\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141563 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a4da05-deae-4395-a91b-b8ddfb804f8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141571 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141592 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdsrn\" (UniqueName: \"kubernetes.io/projected/254e67ea-20e8-4960-ae74-c4d1bff0369a-kube-api-access-zdsrn\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141600 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/657c0b79-3594-4a70-a7de-6152741e8148-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141608 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/254e67ea-20e8-4960-ae74-c4d1bff0369a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141616 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6mh5\" (UniqueName: \"kubernetes.io/projected/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1-kube-api-access-z6mh5\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141626 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x4tk\" (UniqueName: \"kubernetes.io/projected/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-kube-api-access-2x4tk\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141635 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmks\" (UniqueName: \"kubernetes.io/projected/657c0b79-3594-4a70-a7de-6152741e8148-kube-api-access-flmks\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.141645 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.151459 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" event={"ID":"c9af14ed-135f-45d2-9aca-55513eb0e860","Type":"ContainerDied","Data":"f2d6d238e63acd717ec345e19d07e9963e66293add822c8643e4cd73790d94af"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.151496 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2d6d238e63acd717ec345e19d07e9963e66293add822c8643e4cd73790d94af" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.151572 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-w7f6k" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.159319 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sflr2" event={"ID":"579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2","Type":"ContainerDied","Data":"0f548475089f39224d3f8e981cc2cd9baac946d3eed77368c043bbbe0e6f4c54"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.159387 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f548475089f39224d3f8e981cc2cd9baac946d3eed77368c043bbbe0e6f4c54" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.159351 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sflr2" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.162309 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" event={"ID":"657c0b79-3594-4a70-a7de-6152741e8148","Type":"ContainerDied","Data":"a0436199ace60b9ec5b144b916c8aea1a30125b3324b1a46672b143b298947be"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.162355 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0436199ace60b9ec5b144b916c8aea1a30125b3324b1a46672b143b298947be" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.162430 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-l8j7b" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.164757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c8jq7" event={"ID":"c4c2b589-6308-42bc-8b1e-c2d4f3e210b1","Type":"ContainerDied","Data":"24b4a04fb3e8c55c12117001e1e106cac1ed88e6210c89b5804f616efd032a25"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.164798 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b4a04fb3e8c55c12117001e1e106cac1ed88e6210c89b5804f616efd032a25" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.164854 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c8jq7" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.172796 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nn8d9" event={"ID":"254e67ea-20e8-4960-ae74-c4d1bff0369a","Type":"ContainerDied","Data":"a47db05571f6d695eb9ddb92e74165997f15b61fafd17eef617c01e92aaa91da"} Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.172862 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a47db05571f6d695eb9ddb92e74165997f15b61fafd17eef617c01e92aaa91da" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.172979 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nn8d9" Feb 01 07:09:57 crc kubenswrapper[5127]: I0201 07:09:57.194280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerStarted","Data":"dbc9e80f7b4bef599da5fd06adaf83f62eed80764be85679eb7339519c42046c"} Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.155311 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cgmxq"] Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156783 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254e67ea-20e8-4960-ae74-c4d1bff0369a" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156817 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="254e67ea-20e8-4960-ae74-c4d1bff0369a" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156840 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657c0b79-3594-4a70-a7de-6152741e8148" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156849 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="657c0b79-3594-4a70-a7de-6152741e8148" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156864 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a4da05-deae-4395-a91b-b8ddfb804f8a" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156873 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a4da05-deae-4395-a91b-b8ddfb804f8a" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156902 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156912 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156935 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156946 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: E0201 07:09:58.156960 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9af14ed-135f-45d2-9aca-55513eb0e860" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.156970 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9af14ed-135f-45d2-9aca-55513eb0e860" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157205 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9af14ed-135f-45d2-9aca-55513eb0e860" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157225 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="657c0b79-3594-4a70-a7de-6152741e8148" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157236 5127 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157260 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157274 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a4da05-deae-4395-a91b-b8ddfb804f8a" containerName="mariadb-account-create-update" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.157284 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="254e67ea-20e8-4960-ae74-c4d1bff0369a" containerName="mariadb-database-create" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.158509 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.164106 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xdhhh" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.164387 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.164519 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.206425 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cgmxq"] Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.259967 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.260430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.260465 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4f8l\" (UniqueName: \"kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.260866 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.365358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.366395 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.366441 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4f8l\" (UniqueName: \"kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.366611 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.372038 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.372047 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.372489 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.394602 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4f8l\" (UniqueName: \"kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l\") pod \"nova-cell0-conductor-db-sync-cgmxq\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.512311 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:09:58 crc kubenswrapper[5127]: I0201 07:09:58.963789 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cgmxq"] Feb 01 07:09:58 crc kubenswrapper[5127]: W0201 07:09:58.968839 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea46d9a_92b4_4bd6_b9b2_bf342ad7b350.slice/crio-778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0 WatchSource:0}: Error finding container 778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0: Status 404 returned error can't find the container with id 778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0 Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.223680 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerStarted","Data":"0701cb2acbafdcc7b4588f5ec553ad67cc4e0db16bc04d288e06984e9e116277"} Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.223892 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-central-agent" containerID="cri-o://d76f7a36806114208117c1ee1d0b06461621b41baf8155010c177a528a50ed1f" gracePeriod=30 Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.224166 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.224494 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="proxy-httpd" containerID="cri-o://0701cb2acbafdcc7b4588f5ec553ad67cc4e0db16bc04d288e06984e9e116277" gracePeriod=30 Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.224547 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="sg-core" containerID="cri-o://dbc9e80f7b4bef599da5fd06adaf83f62eed80764be85679eb7339519c42046c" gracePeriod=30 Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.224617 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-notification-agent" containerID="cri-o://4535a4dc300b87c6d2e095ff64a11512749c0d4e5f0194b79a07de3abd4555ea" gracePeriod=30 Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.231416 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" event={"ID":"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350","Type":"ContainerStarted","Data":"778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0"} Feb 01 07:09:59 crc kubenswrapper[5127]: I0201 07:09:59.257563 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.177824291 podStartE2EDuration="7.257537125s" podCreationTimestamp="2026-02-01 07:09:52 +0000 UTC" firstStartedPulling="2026-02-01 07:09:54.000806457 +0000 UTC m=+1344.486708820" lastFinishedPulling="2026-02-01 07:09:58.080519281 +0000 UTC m=+1348.566421654" observedRunningTime="2026-02-01 07:09:59.250780701 +0000 UTC m=+1349.736683074" watchObservedRunningTime="2026-02-01 07:09:59.257537125 +0000 
UTC m=+1349.743439488" Feb 01 07:09:59 crc kubenswrapper[5127]: E0201 07:09:59.344993 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312\": RecentStats: unable to find data in memory cache]" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247537 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerID="0701cb2acbafdcc7b4588f5ec553ad67cc4e0db16bc04d288e06984e9e116277" exitCode=0 Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247825 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerID="dbc9e80f7b4bef599da5fd06adaf83f62eed80764be85679eb7339519c42046c" exitCode=2 Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247835 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerID="4535a4dc300b87c6d2e095ff64a11512749c0d4e5f0194b79a07de3abd4555ea" exitCode=0 Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247842 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerID="d76f7a36806114208117c1ee1d0b06461621b41baf8155010c177a528a50ed1f" exitCode=0 Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247616 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerDied","Data":"0701cb2acbafdcc7b4588f5ec553ad67cc4e0db16bc04d288e06984e9e116277"} Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247869 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerDied","Data":"dbc9e80f7b4bef599da5fd06adaf83f62eed80764be85679eb7339519c42046c"} Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247883 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerDied","Data":"4535a4dc300b87c6d2e095ff64a11512749c0d4e5f0194b79a07de3abd4555ea"} Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.247893 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerDied","Data":"d76f7a36806114208117c1ee1d0b06461621b41baf8155010c177a528a50ed1f"} Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.327750 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.417962 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418054 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjsnh\" (UniqueName: \"kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418100 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418274 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418316 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418387 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.418424 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd\") pod \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\" (UID: \"7e5c4c92-911f-4112-853f-1c398c0c0bbc\") " Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.420427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.421883 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.430401 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts" (OuterVolumeSpecName: "scripts") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.438540 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh" (OuterVolumeSpecName: "kube-api-access-fjsnh") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "kube-api-access-fjsnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.449858 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.514692 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521359 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521396 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjsnh\" (UniqueName: \"kubernetes.io/projected/7e5c4c92-911f-4112-853f-1c398c0c0bbc-kube-api-access-fjsnh\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521408 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521416 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521428 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.521435 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e5c4c92-911f-4112-853f-1c398c0c0bbc-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.539341 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data" (OuterVolumeSpecName: "config-data") pod "7e5c4c92-911f-4112-853f-1c398c0c0bbc" (UID: "7e5c4c92-911f-4112-853f-1c398c0c0bbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:00 crc kubenswrapper[5127]: I0201 07:10:00.623273 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e5c4c92-911f-4112-853f-1c398c0c0bbc-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.277362 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e5c4c92-911f-4112-853f-1c398c0c0bbc","Type":"ContainerDied","Data":"5b8c34145583682d76595862846717720fac9940d474d5a5a04b75cf340b9e89"} Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.277414 5127 scope.go:117] "RemoveContainer" containerID="0701cb2acbafdcc7b4588f5ec553ad67cc4e0db16bc04d288e06984e9e116277" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.277551 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.310320 5127 scope.go:117] "RemoveContainer" containerID="dbc9e80f7b4bef599da5fd06adaf83f62eed80764be85679eb7339519c42046c" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.328538 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.333842 5127 scope.go:117] "RemoveContainer" containerID="4535a4dc300b87c6d2e095ff64a11512749c0d4e5f0194b79a07de3abd4555ea" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.342240 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.376750 5127 scope.go:117] "RemoveContainer" containerID="d76f7a36806114208117c1ee1d0b06461621b41baf8155010c177a528a50ed1f" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.377083 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:01 crc kubenswrapper[5127]: E0201 07:10:01.377837 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="proxy-httpd" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.377857 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="proxy-httpd" Feb 01 07:10:01 crc kubenswrapper[5127]: E0201 07:10:01.377869 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-central-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.377876 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-central-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: E0201 07:10:01.377899 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="sg-core" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.377906 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="sg-core" Feb 01 07:10:01 crc kubenswrapper[5127]: E0201 07:10:01.377923 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" 
containerName="ceilometer-notification-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.377930 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-notification-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.378283 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="sg-core" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.378313 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-notification-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.378324 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="ceilometer-central-agent" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.378339 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" containerName="proxy-httpd" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.384122 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.391097 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.391468 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.395830 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.536910 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.536960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.536977 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.536998 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.537029 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " 
pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.537086 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62ld\" (UniqueName: \"kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.537168 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.638929 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.638964 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.638987 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.639029 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.639073 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q62ld\" (UniqueName: \"kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.639131 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.639181 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.639546 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd\") pod \"ceilometer-0\" (UID: 
\"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.640627 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.645416 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.645931 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.646025 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.661639 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62ld\" (UniqueName: \"kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.661725 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts\") pod \"ceilometer-0\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " pod="openstack/ceilometer-0" Feb 01 07:10:01 crc kubenswrapper[5127]: I0201 07:10:01.702880 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:02 crc kubenswrapper[5127]: I0201 07:10:02.205037 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:02 crc kubenswrapper[5127]: W0201 07:10:02.231004 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3c5581_7745_4842_80de_ef904d1ae0a1.slice/crio-f76efb32fa92b21335134612b4c0295350bc1ff8c58b8d4bc73563a38d7bfe59 WatchSource:0}: Error finding container f76efb32fa92b21335134612b4c0295350bc1ff8c58b8d4bc73563a38d7bfe59: Status 404 returned error can't find the container with id f76efb32fa92b21335134612b4c0295350bc1ff8c58b8d4bc73563a38d7bfe59 Feb 01 07:10:02 crc kubenswrapper[5127]: I0201 07:10:02.253016 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e5c4c92-911f-4112-853f-1c398c0c0bbc" path="/var/lib/kubelet/pods/7e5c4c92-911f-4112-853f-1c398c0c0bbc/volumes" Feb 01 07:10:02 crc kubenswrapper[5127]: I0201 07:10:02.295985 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerStarted","Data":"f76efb32fa92b21335134612b4c0295350bc1ff8c58b8d4bc73563a38d7bfe59"} Feb 01 07:10:02 crc kubenswrapper[5127]: I0201 07:10:02.791867 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.148565 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.150477 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-log" containerID="cri-o://de803807ce8b3fdcbb5bdcaab8ac82b578e091733368f0cadd7095da580a6a90" gracePeriod=30 Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.150571 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-httpd" containerID="cri-o://e429fa501c99bbff52997ad8cb9d22463911edc7607c645c63b1029454eb35b5" gracePeriod=30 Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.370830 5127 generic.go:334] "Generic (PLEG): container finished" podID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerID="de803807ce8b3fdcbb5bdcaab8ac82b578e091733368f0cadd7095da580a6a90" exitCode=143 Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.370905 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerDied","Data":"de803807ce8b3fdcbb5bdcaab8ac82b578e091733368f0cadd7095da580a6a90"} Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.372373 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" event={"ID":"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350","Type":"ContainerStarted","Data":"627b8c1220460e6b7ed1941227e8f20d5bd889a0bf5657636bc2ea4ab157629d"} Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.374962 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerStarted","Data":"a4d95826f8362f9d7ffd6f3149e5f6934169ad52cc0652033d1cec6510d9262b"} Feb 01 07:10:09 crc 
kubenswrapper[5127]: I0201 07:10:09.374999 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerStarted","Data":"368ef94e1ee76d7b9e7ae33181f9842f317cc0cead1a295dce0c8acfe734993d"} Feb 01 07:10:09 crc kubenswrapper[5127]: I0201 07:10:09.408132 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" podStartSLOduration=2.0718852070000002 podStartE2EDuration="11.40811184s" podCreationTimestamp="2026-02-01 07:09:58 +0000 UTC" firstStartedPulling="2026-02-01 07:09:58.971238316 +0000 UTC m=+1349.457140679" lastFinishedPulling="2026-02-01 07:10:08.307464939 +0000 UTC m=+1358.793367312" observedRunningTime="2026-02-01 07:10:09.403411223 +0000 UTC m=+1359.889313586" watchObservedRunningTime="2026-02-01 07:10:09.40811184 +0000 UTC m=+1359.894014203" Feb 01 07:10:09 crc kubenswrapper[5127]: E0201 07:10:09.626720 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312\": RecentStats: unable to find data in memory cache]" Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.113513 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.114043 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-log" containerID="cri-o://2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459" gracePeriod=30 Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.114453 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-httpd" containerID="cri-o://5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078" gracePeriod=30 Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.385047 5127 generic.go:334] "Generic (PLEG): container finished" podID="541316fd-1125-4922-8791-4c40a7188768" containerID="2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459" exitCode=143 Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.385128 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerDied","Data":"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459"} Feb 01 07:10:10 crc kubenswrapper[5127]: I0201 07:10:10.387795 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerStarted","Data":"77884977b73d706e8a967c0ecf628190dd552ba4555d29355e6d3279cc2de534"} Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.417025 5127 generic.go:334] "Generic (PLEG): container finished" podID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerID="e429fa501c99bbff52997ad8cb9d22463911edc7607c645c63b1029454eb35b5" exitCode=0 Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.417102 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerDied","Data":"e429fa501c99bbff52997ad8cb9d22463911edc7607c645c63b1029454eb35b5"} Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.420987 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerStarted","Data":"de2c0d559145232f892d8a8b321fe4c14996ae3b3723037699e8f37e3b3db088"} Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.421133 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-central-agent" containerID="cri-o://368ef94e1ee76d7b9e7ae33181f9842f317cc0cead1a295dce0c8acfe734993d" gracePeriod=30 Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.421344 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.421561 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="proxy-httpd" containerID="cri-o://de2c0d559145232f892d8a8b321fe4c14996ae3b3723037699e8f37e3b3db088" gracePeriod=30 Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.421633 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="sg-core" containerID="cri-o://77884977b73d706e8a967c0ecf628190dd552ba4555d29355e6d3279cc2de534" gracePeriod=30 Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.421662 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-notification-agent" containerID="cri-o://a4d95826f8362f9d7ffd6f3149e5f6934169ad52cc0652033d1cec6510d9262b" gracePeriod=30 Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.447500 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9352548889999999 podStartE2EDuration="11.447479417s" podCreationTimestamp="2026-02-01 07:10:01 +0000 UTC" firstStartedPulling="2026-02-01 07:10:02.238988352 +0000 UTC m=+1352.724890715" lastFinishedPulling="2026-02-01 07:10:11.75121288 +0000 UTC m=+1362.237115243" observedRunningTime="2026-02-01 07:10:12.441327455 +0000 UTC m=+1362.927229828" watchObservedRunningTime="2026-02-01 07:10:12.447479417 +0000 UTC m=+1362.933381780" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.817268 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976098 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976163 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976283 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcwds\" (UniqueName: \"kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976333 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976382 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976449 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976500 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.976538 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs\") pod \"4198b002-5dbd-4855-97f1-bae36fd86bf5\" (UID: \"4198b002-5dbd-4855-97f1-bae36fd86bf5\") " Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.977122 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.977192 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs" (OuterVolumeSpecName: "logs") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.986726 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts" (OuterVolumeSpecName: "scripts") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:12 crc kubenswrapper[5127]: I0201 07:10:12.989346 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds" (OuterVolumeSpecName: "kube-api-access-wcwds") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "kube-api-access-wcwds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.000198 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.038704 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.046841 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data" (OuterVolumeSpecName: "config-data") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.077671 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4198b002-5dbd-4855-97f1-bae36fd86bf5" (UID: "4198b002-5dbd-4855-97f1-bae36fd86bf5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078377 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcwds\" (UniqueName: \"kubernetes.io/projected/4198b002-5dbd-4855-97f1-bae36fd86bf5-kube-api-access-wcwds\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078399 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078440 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078451 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078461 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078470 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078478 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4198b002-5dbd-4855-97f1-bae36fd86bf5-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.078500 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4198b002-5dbd-4855-97f1-bae36fd86bf5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.100608 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.180499 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.432020 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.432011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4198b002-5dbd-4855-97f1-bae36fd86bf5","Type":"ContainerDied","Data":"1ee9603ae51259dfae420ca57c3ed1fd060d26a029054ae3aac803505eda39b4"} Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.433231 5127 scope.go:117] "RemoveContainer" containerID="e429fa501c99bbff52997ad8cb9d22463911edc7607c645c63b1029454eb35b5" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437690 5127 generic.go:334] "Generic (PLEG): container finished" podID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerID="de2c0d559145232f892d8a8b321fe4c14996ae3b3723037699e8f37e3b3db088" exitCode=0 Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437725 5127 generic.go:334] "Generic (PLEG): container finished" podID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerID="77884977b73d706e8a967c0ecf628190dd552ba4555d29355e6d3279cc2de534" exitCode=2 Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437734 5127 generic.go:334] "Generic (PLEG): container finished" podID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerID="a4d95826f8362f9d7ffd6f3149e5f6934169ad52cc0652033d1cec6510d9262b" exitCode=0 Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerDied","Data":"de2c0d559145232f892d8a8b321fe4c14996ae3b3723037699e8f37e3b3db088"} Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437784 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerDied","Data":"77884977b73d706e8a967c0ecf628190dd552ba4555d29355e6d3279cc2de534"} Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.437797 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerDied","Data":"a4d95826f8362f9d7ffd6f3149e5f6934169ad52cc0652033d1cec6510d9262b"} Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.456779 5127 scope.go:117] "RemoveContainer" containerID="de803807ce8b3fdcbb5bdcaab8ac82b578e091733368f0cadd7095da580a6a90" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.487739 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.503810 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.514609 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:13 crc kubenswrapper[5127]: E0201 07:10:13.515087 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-httpd" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.515110 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-httpd" Feb 01 07:10:13 crc kubenswrapper[5127]: E0201 07:10:13.515123 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-log" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.515131 5127 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-log" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.515336 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-log" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.515361 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" containerName="glance-httpd" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.516529 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.521354 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.522452 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.522710 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689323 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689373 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689412 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689461 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689489 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn288\" (UniqueName: \"kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689515 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs\") pod \"glance-default-external-api-0\" 
(UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689555 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.689602 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791095 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791171 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791195 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791229 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791259 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791283 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn288\" (UniqueName: \"kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " 
pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791346 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.791859 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.792394 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.792548 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.796554 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.798615 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.811267 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn288\" (UniqueName: \"kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.811775 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.812777 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:13 crc kubenswrapper[5127]: I0201 07:10:13.854817 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " pod="openstack/glance-default-external-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.135898 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.252733 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4198b002-5dbd-4855-97f1-bae36fd86bf5" path="/var/lib/kubelet/pods/4198b002-5dbd-4855-97f1-bae36fd86bf5/volumes" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.306988 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316111 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316379 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316471 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6hq4\" (UniqueName: \"kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316515 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316533 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316562 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316595 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.316615 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run\") pod \"541316fd-1125-4922-8791-4c40a7188768\" (UID: \"541316fd-1125-4922-8791-4c40a7188768\") " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.318978 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs" (OuterVolumeSpecName: "logs") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.319112 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.321471 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts" (OuterVolumeSpecName: "scripts") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.323861 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.330128 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4" (OuterVolumeSpecName: "kube-api-access-r6hq4") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "kube-api-access-r6hq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.381795 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.396747 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data" (OuterVolumeSpecName: "config-data") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.413284 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "541316fd-1125-4922-8791-4c40a7188768" (UID: "541316fd-1125-4922-8791-4c40a7188768"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.420905 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.420965 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.420977 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.421028 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.421040 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/541316fd-1125-4922-8791-4c40a7188768-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.421049 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.421057 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/541316fd-1125-4922-8791-4c40a7188768-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.421065 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6hq4\" (UniqueName: \"kubernetes.io/projected/541316fd-1125-4922-8791-4c40a7188768-kube-api-access-r6hq4\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.440009 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.451690 5127 generic.go:334] "Generic (PLEG): container finished" podID="541316fd-1125-4922-8791-4c40a7188768" containerID="5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078" exitCode=0 Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.451729 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerDied","Data":"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078"} Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.451751 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.451781 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"541316fd-1125-4922-8791-4c40a7188768","Type":"ContainerDied","Data":"bdd701ae3852e570566e8b7bb18ddcc50af81bbce0d6dc305a27bfdc29d892ad"} Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.451813 5127 scope.go:117] "RemoveContainer" containerID="5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.481232 5127 scope.go:117] "RemoveContainer" containerID="2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.504802 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.513461 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.515914 5127 scope.go:117] "RemoveContainer" containerID="5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.521989 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:14 crc kubenswrapper[5127]: E0201 07:10:14.527806 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078\": container with ID starting with 5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078 not found: ID does not exist" containerID="5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.527853 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078"} err="failed to get container status \"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078\": rpc error: code = NotFound desc = could not find container \"5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078\": container with ID starting with 5e45ff67d197136bc5d7675cc049ae2c8be3be0e2a2f431c0662e89f92e01078 not found: ID does not exist" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.527886 5127 scope.go:117] "RemoveContainer" containerID="2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459" Feb 01 07:10:14 crc kubenswrapper[5127]: E0201 07:10:14.531992 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459\": container with ID starting with 2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459 not found: ID does not exist" containerID="2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.532029 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459"} err="failed to get container status \"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459\": rpc error: code = 
NotFound desc = could not find container \"2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459\": container with ID starting with 2cb2b0a614b1390b8d107dad1e2960707fb49ab72e8df3f5f148b896fc275459 not found: ID does not exist" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.537017 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:14 crc kubenswrapper[5127]: E0201 07:10:14.537515 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-httpd" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.537538 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-httpd" Feb 01 07:10:14 crc kubenswrapper[5127]: E0201 07:10:14.537553 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-log" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.537561 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-log" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.537817 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-log" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.537843 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="541316fd-1125-4922-8791-4c40a7188768" containerName="glance-httpd" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.538850 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.541041 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.541238 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.560682 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.740941 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.754920 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lflpb\" (UniqueName: \"kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.754975 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755020 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755067 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755101 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.755116 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.856940 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lflpb\" (UniqueName: \"kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857182 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857263 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857332 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857400 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857590 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.857667 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.858132 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.859002 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.859251 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.863573 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.864023 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") 
" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.865364 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.869725 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.875035 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lflpb\" (UniqueName: \"kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:14 crc kubenswrapper[5127]: I0201 07:10:14.901008 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") " pod="openstack/glance-default-internal-api-0" Feb 01 07:10:15 crc kubenswrapper[5127]: I0201 07:10:15.166752 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:15 crc kubenswrapper[5127]: I0201 07:10:15.472426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerStarted","Data":"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf"} Feb 01 07:10:15 crc kubenswrapper[5127]: I0201 07:10:15.472723 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerStarted","Data":"198654e0dfa52a31b4265533d3dddf843d18338e387b896924c598a1c48f6e8e"} Feb 01 07:10:15 crc kubenswrapper[5127]: I0201 07:10:15.708549 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:10:15 crc kubenswrapper[5127]: W0201 07:10:15.718030 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9187249_9aa3_4b9e_a7db_47d95e5c4f6d.slice/crio-c3bab97ff69b9f5dfb89d459d2cda1d227b4e8b94b7a20cce9ba0a3bb9e3b52d WatchSource:0}: Error finding container c3bab97ff69b9f5dfb89d459d2cda1d227b4e8b94b7a20cce9ba0a3bb9e3b52d: Status 404 returned error can't find the container with id c3bab97ff69b9f5dfb89d459d2cda1d227b4e8b94b7a20cce9ba0a3bb9e3b52d Feb 01 07:10:16 crc kubenswrapper[5127]: I0201 07:10:16.251080 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541316fd-1125-4922-8791-4c40a7188768" path="/var/lib/kubelet/pods/541316fd-1125-4922-8791-4c40a7188768/volumes" Feb 01 07:10:16 crc kubenswrapper[5127]: I0201 07:10:16.485692 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerStarted","Data":"6c3858c14ef4c311b1deda9d45684f86e030100946c594b504545c60e4d6512d"} Feb 01 07:10:16 crc kubenswrapper[5127]: I0201 07:10:16.486053 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerStarted","Data":"c3bab97ff69b9f5dfb89d459d2cda1d227b4e8b94b7a20cce9ba0a3bb9e3b52d"} Feb 01 07:10:16 crc kubenswrapper[5127]: I0201 07:10:16.487626 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerStarted","Data":"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183"} Feb 01 07:10:16 crc kubenswrapper[5127]: I0201 07:10:16.521305 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.521282662 podStartE2EDuration="3.521282662s" podCreationTimestamp="2026-02-01 07:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:10:16.514565395 +0000 UTC m=+1367.000467768" watchObservedRunningTime="2026-02-01 07:10:16.521282662 +0000 UTC m=+1367.007185025" Feb 01 07:10:17 crc kubenswrapper[5127]: I0201 07:10:17.500009 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerStarted","Data":"bb0ddfad39e508e52c1255ed282f7f8d3226087a328247cb63a34d4e3ddca978"} Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.511239 5127 generic.go:334] "Generic (PLEG): container finished" podID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerID="368ef94e1ee76d7b9e7ae33181f9842f317cc0cead1a295dce0c8acfe734993d" exitCode=0 Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.511325 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerDied","Data":"368ef94e1ee76d7b9e7ae33181f9842f317cc0cead1a295dce0c8acfe734993d"} Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.747116 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.771526 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.771506926 podStartE2EDuration="4.771506926s" podCreationTimestamp="2026-02-01 07:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:10:17.519872935 +0000 UTC m=+1368.005775308" watchObservedRunningTime="2026-02-01 07:10:18.771506926 +0000 UTC m=+1369.257409299" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850371 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q62ld\" (UniqueName: \"kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850474 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850540 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850608 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850672 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850710 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850753 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts\") pod \"1a3c5581-7745-4842-80de-ef904d1ae0a1\" (UID: \"1a3c5581-7745-4842-80de-ef904d1ae0a1\") " Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.850952 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.851046 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.851702 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.851766 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a3c5581-7745-4842-80de-ef904d1ae0a1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.856510 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts" (OuterVolumeSpecName: "scripts") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.856549 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld" (OuterVolumeSpecName: "kube-api-access-q62ld") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "kube-api-access-q62ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.881281 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.938461 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.953507 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q62ld\" (UniqueName: \"kubernetes.io/projected/1a3c5581-7745-4842-80de-ef904d1ae0a1-kube-api-access-q62ld\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.953553 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.953572 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.953614 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:18 crc kubenswrapper[5127]: I0201 07:10:18.986781 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data" (OuterVolumeSpecName: "config-data") pod "1a3c5581-7745-4842-80de-ef904d1ae0a1" (UID: "1a3c5581-7745-4842-80de-ef904d1ae0a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.055763 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a3c5581-7745-4842-80de-ef904d1ae0a1-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.525745 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a3c5581-7745-4842-80de-ef904d1ae0a1","Type":"ContainerDied","Data":"f76efb32fa92b21335134612b4c0295350bc1ff8c58b8d4bc73563a38d7bfe59"} Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.525835 5127 scope.go:117] "RemoveContainer" containerID="de2c0d559145232f892d8a8b321fe4c14996ae3b3723037699e8f37e3b3db088" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.525863 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.565925 5127 scope.go:117] "RemoveContainer" containerID="77884977b73d706e8a967c0ecf628190dd552ba4555d29355e6d3279cc2de534" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.567912 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.577245 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.608575 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:19 crc kubenswrapper[5127]: E0201 07:10:19.609418 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-central-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609453 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-central-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: E0201 07:10:19.609493 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="proxy-httpd" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609505 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="proxy-httpd" Feb 01 07:10:19 crc kubenswrapper[5127]: E0201 07:10:19.609533 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="sg-core" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609543 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="sg-core" Feb 01 07:10:19 crc kubenswrapper[5127]: E0201 07:10:19.609559 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-notification-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609568 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-notification-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609850 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-notification-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609896 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="proxy-httpd" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609913 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="sg-core" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.609944 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" containerName="ceilometer-central-agent" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.612199 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.613499 5127 scope.go:117] "RemoveContainer" containerID="a4d95826f8362f9d7ffd6f3149e5f6934169ad52cc0652033d1cec6510d9262b" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.615814 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.616163 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.651698 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.655370 5127 scope.go:117] "RemoveContainer" containerID="368ef94e1ee76d7b9e7ae33181f9842f317cc0cead1a295dce0c8acfe734993d" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767327 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767379 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767443 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767558 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx2nc\" (UniqueName: \"kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767673 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767718 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.767739 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 
07:10:19.869669 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx2nc\" (UniqueName: \"kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.870095 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.871557 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.871605 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: E0201 07:10:19.870175 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312\": RecentStats: unable to find data in memory cache]" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.871745 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.871782 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.871864 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.872456 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.879099 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc 
kubenswrapper[5127]: I0201 07:10:19.879457 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.879847 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.880507 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.887183 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.888719 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx2nc\" (UniqueName: \"kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc\") pod \"ceilometer-0\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " pod="openstack/ceilometer-0" Feb 01 07:10:19 crc kubenswrapper[5127]: I0201 07:10:19.931867 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:20 crc kubenswrapper[5127]: I0201 07:10:20.246237 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3c5581-7745-4842-80de-ef904d1ae0a1" path="/var/lib/kubelet/pods/1a3c5581-7745-4842-80de-ef904d1ae0a1/volumes" Feb 01 07:10:20 crc kubenswrapper[5127]: W0201 07:10:20.420539 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb755fb4b_1983_4bfb_91eb_f3f7d00764b4.slice/crio-f15c508846ada0215f6f6712d8c21e16d63f6680ef8fa9d1a5a10a1035474d94 WatchSource:0}: Error finding container f15c508846ada0215f6f6712d8c21e16d63f6680ef8fa9d1a5a10a1035474d94: Status 404 returned error can't find the container with id f15c508846ada0215f6f6712d8c21e16d63f6680ef8fa9d1a5a10a1035474d94 Feb 01 07:10:20 crc kubenswrapper[5127]: I0201 07:10:20.427007 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:20 crc kubenswrapper[5127]: I0201 07:10:20.535515 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerStarted","Data":"f15c508846ada0215f6f6712d8c21e16d63f6680ef8fa9d1a5a10a1035474d94"} Feb 01 07:10:20 crc kubenswrapper[5127]: I0201 07:10:20.537099 5127 generic.go:334] "Generic (PLEG): container finished" podID="aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" containerID="627b8c1220460e6b7ed1941227e8f20d5bd889a0bf5657636bc2ea4ab157629d" exitCode=0 Feb 01 07:10:20 crc kubenswrapper[5127]: I0201 07:10:20.537156 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" event={"ID":"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350","Type":"ContainerDied","Data":"627b8c1220460e6b7ed1941227e8f20d5bd889a0bf5657636bc2ea4ab157629d"} Feb 01 07:10:21 crc kubenswrapper[5127]: I0201 07:10:21.562092 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerStarted","Data":"16a0f5183c272650e62d3aa833e2adae7abab06f67af3856a2ff14ced66eb62e"} Feb 01 07:10:21 crc kubenswrapper[5127]: I0201 07:10:21.963261 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.028406 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle\") pod \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.028452 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data\") pod \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.028689 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4f8l\" (UniqueName: \"kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l\") pod \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.028730 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts\") pod \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\" (UID: \"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350\") " Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.033838 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts" (OuterVolumeSpecName: "scripts") pod "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" (UID: "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.034487 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l" (OuterVolumeSpecName: "kube-api-access-s4f8l") pod "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" (UID: "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350"). InnerVolumeSpecName "kube-api-access-s4f8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.054850 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" (UID: "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.061299 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data" (OuterVolumeSpecName: "config-data") pod "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" (UID: "aea46d9a-92b4-4bd6-b9b2-bf342ad7b350"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.131359 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4f8l\" (UniqueName: \"kubernetes.io/projected/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-kube-api-access-s4f8l\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.131386 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.131395 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.131404 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.571661 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.571693 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cgmxq" event={"ID":"aea46d9a-92b4-4bd6-b9b2-bf342ad7b350","Type":"ContainerDied","Data":"778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0"} Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.572151 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="778ae7996088e507458f18507ceccde916cada70805e045f6086b3f1eadd9ce0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.573838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerStarted","Data":"878193ba2b117e6c901c5a95211cef22b29eb9ebc76f02d101d3c29b3ea2e806"} Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.573868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerStarted","Data":"6d14755a29add2d0e9f7d999df1bf53d0403286b5a8ecbe46e996bc4fc2bb62c"} Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.669030 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:22 crc kubenswrapper[5127]: E0201 07:10:22.669752 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" containerName="nova-cell0-conductor-db-sync" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.669785 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" containerName="nova-cell0-conductor-db-sync" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.670143 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" containerName="nova-cell0-conductor-db-sync" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.671203 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.673943 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.680453 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xdhhh" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.684211 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.743410 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.743500 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.743550 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcm8l\" (UniqueName: \"kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.845035 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.845109 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.845148 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcm8l\" (UniqueName: \"kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.849700 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.856413 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.874149 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcm8l\" (UniqueName: \"kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l\") pod \"nova-cell0-conductor-0\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:22 crc kubenswrapper[5127]: I0201 07:10:22.986852 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:23 crc kubenswrapper[5127]: W0201 07:10:23.452535 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814a0b0d_2a0b_49fa_9477_65d1056b2c3e.slice/crio-f77aa53baaec039802ebfea1d73e469c4ef25eafb0fe722622e73b7b3456421a WatchSource:0}: Error finding container f77aa53baaec039802ebfea1d73e469c4ef25eafb0fe722622e73b7b3456421a: Status 404 returned error can't find the container with id f77aa53baaec039802ebfea1d73e469c4ef25eafb0fe722622e73b7b3456421a Feb 01 07:10:23 crc kubenswrapper[5127]: I0201 07:10:23.455877 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:23 crc kubenswrapper[5127]: I0201 07:10:23.583932 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"814a0b0d-2a0b-49fa-9477-65d1056b2c3e","Type":"ContainerStarted","Data":"f77aa53baaec039802ebfea1d73e469c4ef25eafb0fe722622e73b7b3456421a"} Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.136342 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.136391 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.179717 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.194181 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.511478 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.592621 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"814a0b0d-2a0b-49fa-9477-65d1056b2c3e","Type":"ContainerStarted","Data":"4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba"} Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.592748 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.595744 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerStarted","Data":"bfaf0f74295546fcd3bacd42aa6cfcf7bf9202a9e26a25b450b190881e732644"} Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.595991 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: 
I0201 07:10:24.596012 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.596039 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.609082 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.6090684360000003 podStartE2EDuration="2.609068436s" podCreationTimestamp="2026-02-01 07:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:10:24.605773239 +0000 UTC m=+1375.091675602" watchObservedRunningTime="2026-02-01 07:10:24.609068436 +0000 UTC m=+1375.094970799" Feb 01 07:10:24 crc kubenswrapper[5127]: I0201 07:10:24.624796 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9767583850000001 podStartE2EDuration="5.62477341s" podCreationTimestamp="2026-02-01 07:10:19 +0000 UTC" firstStartedPulling="2026-02-01 07:10:20.422930471 +0000 UTC m=+1370.908832834" lastFinishedPulling="2026-02-01 07:10:24.070945476 +0000 UTC m=+1374.556847859" observedRunningTime="2026-02-01 07:10:24.622600543 +0000 UTC m=+1375.108502906" watchObservedRunningTime="2026-02-01 07:10:24.62477341 +0000 UTC m=+1375.110675773" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.167054 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.167101 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.212654 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.212964 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.603743 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" gracePeriod=30 Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.604284 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:25 crc kubenswrapper[5127]: I0201 07:10:25.604337 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.413651 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.589821 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.614474 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-central-agent" 
containerID="cri-o://16a0f5183c272650e62d3aa833e2adae7abab06f67af3856a2ff14ced66eb62e" gracePeriod=30 Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.614513 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="proxy-httpd" containerID="cri-o://bfaf0f74295546fcd3bacd42aa6cfcf7bf9202a9e26a25b450b190881e732644" gracePeriod=30 Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.614554 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="sg-core" containerID="cri-o://878193ba2b117e6c901c5a95211cef22b29eb9ebc76f02d101d3c29b3ea2e806" gracePeriod=30 Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.614652 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-notification-agent" containerID="cri-o://6d14755a29add2d0e9f7d999df1bf53d0403286b5a8ecbe46e996bc4fc2bb62c" gracePeriod=30 Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.614761 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:10:26 crc kubenswrapper[5127]: I0201 07:10:26.645270 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.629954 5127 generic.go:334] "Generic (PLEG): container finished" podID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerID="bfaf0f74295546fcd3bacd42aa6cfcf7bf9202a9e26a25b450b190881e732644" exitCode=0 Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.629990 5127 generic.go:334] "Generic (PLEG): container finished" podID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerID="878193ba2b117e6c901c5a95211cef22b29eb9ebc76f02d101d3c29b3ea2e806" exitCode=2 Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630001 5127 generic.go:334] "Generic (PLEG): container finished" podID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerID="6d14755a29add2d0e9f7d999df1bf53d0403286b5a8ecbe46e996bc4fc2bb62c" exitCode=0 Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630010 5127 generic.go:334] "Generic (PLEG): container finished" podID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerID="16a0f5183c272650e62d3aa833e2adae7abab06f67af3856a2ff14ced66eb62e" exitCode=0 Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630866 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerDied","Data":"bfaf0f74295546fcd3bacd42aa6cfcf7bf9202a9e26a25b450b190881e732644"} Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630897 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerDied","Data":"878193ba2b117e6c901c5a95211cef22b29eb9ebc76f02d101d3c29b3ea2e806"} Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630908 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerDied","Data":"6d14755a29add2d0e9f7d999df1bf53d0403286b5a8ecbe46e996bc4fc2bb62c"} Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.630916 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerDied","Data":"16a0f5183c272650e62d3aa833e2adae7abab06f67af3856a2ff14ced66eb62e"} Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.770294 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.770450 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.772601 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 07:10:27 crc kubenswrapper[5127]: I0201 07:10:27.943945 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.062121 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.062178 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.062224 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.063155 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx2nc\" (UniqueName: \"kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.063240 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.063260 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.063326 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts\") pod \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\" (UID: \"b755fb4b-1983-4bfb-91eb-f3f7d00764b4\") " Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.064526 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.064699 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.068176 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts" (OuterVolumeSpecName: "scripts") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.071692 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc" (OuterVolumeSpecName: "kube-api-access-rx2nc") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "kube-api-access-rx2nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.109070 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.128097 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.156372 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data" (OuterVolumeSpecName: "config-data") pod "b755fb4b-1983-4bfb-91eb-f3f7d00764b4" (UID: "b755fb4b-1983-4bfb-91eb-f3f7d00764b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165316 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165354 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165367 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx2nc\" (UniqueName: \"kubernetes.io/projected/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-kube-api-access-rx2nc\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165378 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165385 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165392 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.165401 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b755fb4b-1983-4bfb-91eb-f3f7d00764b4-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.641789 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.641790 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b755fb4b-1983-4bfb-91eb-f3f7d00764b4","Type":"ContainerDied","Data":"f15c508846ada0215f6f6712d8c21e16d63f6680ef8fa9d1a5a10a1035474d94"} Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.641859 5127 scope.go:117] "RemoveContainer" containerID="bfaf0f74295546fcd3bacd42aa6cfcf7bf9202a9e26a25b450b190881e732644" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.668192 5127 scope.go:117] "RemoveContainer" containerID="878193ba2b117e6c901c5a95211cef22b29eb9ebc76f02d101d3c29b3ea2e806" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.675654 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.688782 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.696671 5127 scope.go:117] "RemoveContainer" containerID="6d14755a29add2d0e9f7d999df1bf53d0403286b5a8ecbe46e996bc4fc2bb62c" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718050 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:28 crc kubenswrapper[5127]: E0201 07:10:28.718385 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-notification-agent" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718396 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-notification-agent" Feb 01 07:10:28 crc kubenswrapper[5127]: E0201 07:10:28.718406 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="proxy-httpd" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718412 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="proxy-httpd" Feb 01 07:10:28 crc kubenswrapper[5127]: E0201 07:10:28.718424 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="sg-core" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718429 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="sg-core" Feb 01 07:10:28 crc kubenswrapper[5127]: E0201 07:10:28.718443 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-central-agent" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718449 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-central-agent" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718637 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="sg-core" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718646 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="proxy-httpd" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718658 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-central-agent" Feb 01 
07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.718786 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" containerName="ceilometer-notification-agent" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.720819 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.728476 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.729452 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.749105 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.749257 5127 scope.go:117] "RemoveContainer" containerID="16a0f5183c272650e62d3aa833e2adae7abab06f67af3856a2ff14ced66eb62e" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.880982 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881028 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qct\" (UniqueName: \"kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881050 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881093 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881113 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881254 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.881409 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd\") pod 
\"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982529 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982615 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982684 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982706 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qct\" (UniqueName: \"kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982726 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982754 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.982774 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.983134 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.983736 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.986273 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.986808 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.990127 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:28 crc kubenswrapper[5127]: I0201 07:10:28.990184 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:29 crc kubenswrapper[5127]: I0201 07:10:29.004568 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qct\" (UniqueName: \"kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct\") pod \"ceilometer-0\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " pod="openstack/ceilometer-0" Feb 01 07:10:29 crc kubenswrapper[5127]: I0201 07:10:29.050052 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:10:29 crc kubenswrapper[5127]: W0201 07:10:29.501715 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f5a66ae_021d_43bb_ba2c_74aa344c3334.slice/crio-160d07426a5f2cadeafc6d984720608ba07ca9bda2f032d8529ea32348ed52ca WatchSource:0}: Error finding container 160d07426a5f2cadeafc6d984720608ba07ca9bda2f032d8529ea32348ed52ca: Status 404 returned error can't find the container with id 160d07426a5f2cadeafc6d984720608ba07ca9bda2f032d8529ea32348ed52ca Feb 01 07:10:29 crc kubenswrapper[5127]: I0201 07:10:29.505552 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:10:29 crc kubenswrapper[5127]: I0201 07:10:29.650520 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerStarted","Data":"160d07426a5f2cadeafc6d984720608ba07ca9bda2f032d8529ea32348ed52ca"} Feb 01 07:10:30 crc kubenswrapper[5127]: E0201 07:10:30.172579 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod474c7fb9_fcd9_48aa_9287_d114245f9a63.slice/crio-9ebedb3601edff6953a3f5914e679521ad804bcce6579c331cbed329ce14c312\": RecentStats: unable to find data in memory cache]" Feb 01 07:10:30 crc kubenswrapper[5127]: I0201 07:10:30.269190 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b755fb4b-1983-4bfb-91eb-f3f7d00764b4" path="/var/lib/kubelet/pods/b755fb4b-1983-4bfb-91eb-f3f7d00764b4/volumes" Feb 01 07:10:30 crc kubenswrapper[5127]: I0201 07:10:30.666909 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerStarted","Data":"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f"} Feb 01 07:10:31 crc kubenswrapper[5127]: I0201 07:10:31.684931 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerStarted","Data":"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46"} Feb 01 07:10:31 crc kubenswrapper[5127]: I0201 07:10:31.686216 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerStarted","Data":"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568"} Feb 01 07:10:32 crc kubenswrapper[5127]: E0201 07:10:32.989918 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:32 crc kubenswrapper[5127]: E0201 07:10:32.993986 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:32 crc kubenswrapper[5127]: E0201 07:10:32.995872 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:32 crc kubenswrapper[5127]: E0201 07:10:32.995914 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:33 crc kubenswrapper[5127]: I0201 07:10:33.709328 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerStarted","Data":"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a"} Feb 01 07:10:33 crc kubenswrapper[5127]: I0201 07:10:33.710097 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:10:33 crc kubenswrapper[5127]: I0201 07:10:33.758107 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.902752716 podStartE2EDuration="5.758080094s" podCreationTimestamp="2026-02-01 07:10:28 +0000 UTC" firstStartedPulling="2026-02-01 07:10:29.50367064 +0000 UTC m=+1379.989573023" lastFinishedPulling="2026-02-01 07:10:33.358998038 +0000 UTC m=+1383.844900401" observedRunningTime="2026-02-01 07:10:33.732331185 +0000 UTC m=+1384.218233578" watchObservedRunningTime="2026-02-01 07:10:33.758080094 +0000 UTC m=+1384.243982497" Feb 01 07:10:37 crc kubenswrapper[5127]: E0201 07:10:37.989294 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:37 crc kubenswrapper[5127]: E0201 07:10:37.992889 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:37 crc kubenswrapper[5127]: E0201 07:10:37.994728 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:37 crc kubenswrapper[5127]: E0201 07:10:37.994809 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:38 crc kubenswrapper[5127]: E0201 07:10:38.917780 5127 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/4c5f4ee20bd514914b1a3e721846ab5d10bb77d5f8cc71201cbe3ebace913a74/diff" to get inode usage: stat /var/lib/containers/storage/overlay/4c5f4ee20bd514914b1a3e721846ab5d10bb77d5f8cc71201cbe3ebace913a74/diff: no such file or directory, extraDiskErr: Feb 01 07:10:42 crc kubenswrapper[5127]: E0201 07:10:42.994517 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:42 crc kubenswrapper[5127]: E0201 07:10:42.997792 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:42 crc kubenswrapper[5127]: E0201 07:10:42.999077 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:42 crc kubenswrapper[5127]: E0201 07:10:42.999131 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:47 crc kubenswrapper[5127]: E0201 07:10:47.989597 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:47 crc kubenswrapper[5127]: E0201 07:10:47.991951 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:47 crc kubenswrapper[5127]: E0201 07:10:47.993781 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:47 crc kubenswrapper[5127]: E0201 07:10:47.993816 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:52 crc kubenswrapper[5127]: E0201 07:10:52.990956 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:52 crc kubenswrapper[5127]: E0201 07:10:52.994518 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:52 crc kubenswrapper[5127]: E0201 07:10:52.997012 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:10:52 crc kubenswrapper[5127]: E0201 07:10:52.997128 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:55 crc kubenswrapper[5127]: E0201 07:10:55.965390 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814a0b0d_2a0b_49fa_9477_65d1056b2c3e.slice/crio-4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod814a0b0d_2a0b_49fa_9477_65d1056b2c3e.slice/crio-conmon-4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba.scope\": RecentStats: unable to find data in memory cache]" Feb 01 07:10:55 crc kubenswrapper[5127]: I0201 07:10:55.994358 5127 generic.go:334] "Generic (PLEG): container finished" podID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" exitCode=137 Feb 01 07:10:55 crc kubenswrapper[5127]: I0201 07:10:55.994403 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"814a0b0d-2a0b-49fa-9477-65d1056b2c3e","Type":"ContainerDied","Data":"4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba"} Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.130075 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.275463 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle\") pod \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.275748 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcm8l\" (UniqueName: \"kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l\") pod \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.275909 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data\") pod \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\" (UID: \"814a0b0d-2a0b-49fa-9477-65d1056b2c3e\") " Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.289360 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l" (OuterVolumeSpecName: "kube-api-access-rcm8l") pod "814a0b0d-2a0b-49fa-9477-65d1056b2c3e" (UID: "814a0b0d-2a0b-49fa-9477-65d1056b2c3e"). InnerVolumeSpecName "kube-api-access-rcm8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.308757 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data" (OuterVolumeSpecName: "config-data") pod "814a0b0d-2a0b-49fa-9477-65d1056b2c3e" (UID: "814a0b0d-2a0b-49fa-9477-65d1056b2c3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.326100 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "814a0b0d-2a0b-49fa-9477-65d1056b2c3e" (UID: "814a0b0d-2a0b-49fa-9477-65d1056b2c3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.379144 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcm8l\" (UniqueName: \"kubernetes.io/projected/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-kube-api-access-rcm8l\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.379194 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:56 crc kubenswrapper[5127]: I0201 07:10:56.379213 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/814a0b0d-2a0b-49fa-9477-65d1056b2c3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.009617 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"814a0b0d-2a0b-49fa-9477-65d1056b2c3e","Type":"ContainerDied","Data":"f77aa53baaec039802ebfea1d73e469c4ef25eafb0fe722622e73b7b3456421a"} Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.009663 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.009702 5127 scope.go:117] "RemoveContainer" containerID="4a5824ee8b829b61a2e982788d0cadd72f65e07b6a913fee6c5bc5e896909cba" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.046387 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.059477 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.079531 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:57 crc kubenswrapper[5127]: E0201 07:10:57.080913 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.081075 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.081386 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" containerName="nova-cell0-conductor-conductor" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.082233 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.084399 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xdhhh" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.085239 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.094848 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.200413 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.200483 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mldtz\" (UniqueName: \"kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.200778 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.302808 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.303208 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mldtz\" (UniqueName: \"kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.303541 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.311510 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.317115 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.323476 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mldtz\" (UniqueName: \"kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz\") pod \"nova-cell0-conductor-0\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") " pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.415810 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:57 crc kubenswrapper[5127]: I0201 07:10:57.964467 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:10:58 crc kubenswrapper[5127]: I0201 07:10:58.045976 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4bdc00fa-e725-42ac-8336-e78b107b64e6","Type":"ContainerStarted","Data":"362d0588973ae23a05220b077099960259624f1e79fbb73e64e23d8616d697c4"} Feb 01 07:10:58 crc kubenswrapper[5127]: I0201 07:10:58.258176 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814a0b0d-2a0b-49fa-9477-65d1056b2c3e" path="/var/lib/kubelet/pods/814a0b0d-2a0b-49fa-9477-65d1056b2c3e/volumes" Feb 01 07:10:59 crc kubenswrapper[5127]: I0201 07:10:59.059985 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4bdc00fa-e725-42ac-8336-e78b107b64e6","Type":"ContainerStarted","Data":"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"} Feb 01 07:10:59 crc kubenswrapper[5127]: I0201 07:10:59.060449 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 07:10:59 crc kubenswrapper[5127]: I0201 07:10:59.060531 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 07:10:59 crc kubenswrapper[5127]: I0201 07:10:59.089548 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.089528241 podStartE2EDuration="2.089528241s" podCreationTimestamp="2026-02-01 07:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:10:59.085045503 +0000 UTC m=+1409.570947916" watchObservedRunningTime="2026-02-01 07:10:59.089528241 +0000 UTC m=+1409.575430604" Feb 01 07:11:02 crc kubenswrapper[5127]: I0201 07:11:02.794251 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:02 crc kubenswrapper[5127]: I0201 07:11:02.795372 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" containerName="kube-state-metrics" containerID="cri-o://68db31784fbb766278c199e8e73106a300362c657897152ed214f52dcf6d04fa" gracePeriod=30 Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.103175 5127 generic.go:334] "Generic (PLEG): container finished" podID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" containerID="68db31784fbb766278c199e8e73106a300362c657897152ed214f52dcf6d04fa" exitCode=2 Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.103228 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf","Type":"ContainerDied","Data":"68db31784fbb766278c199e8e73106a300362c657897152ed214f52dcf6d04fa"} Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.299991 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.447361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szfpc\" (UniqueName: \"kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc\") pod \"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf\" (UID: \"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf\") " Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.455098 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc" (OuterVolumeSpecName: "kube-api-access-szfpc") pod "cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" (UID: "cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf"). InnerVolumeSpecName "kube-api-access-szfpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:03 crc kubenswrapper[5127]: I0201 07:11:03.550322 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szfpc\" (UniqueName: \"kubernetes.io/projected/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf-kube-api-access-szfpc\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.115849 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf","Type":"ContainerDied","Data":"2cc9dc9e4a17064cc38d9068fcaab64cdd0588ffc8e0f3a8abe5165aefc4c9d7"} Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.116285 5127 scope.go:117] "RemoveContainer" containerID="68db31784fbb766278c199e8e73106a300362c657897152ed214f52dcf6d04fa" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.116011 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.156912 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.176831 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.187898 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:04 crc kubenswrapper[5127]: E0201 07:11:04.188386 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" containerName="kube-state-metrics" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.188407 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" containerName="kube-state-metrics" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.188669 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" containerName="kube-state-metrics" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.189486 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.192140 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.195328 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.199959 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.252861 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf" path="/var/lib/kubelet/pods/cc1ce7a9-1673-4b47-842c-e6a4b5db9ddf/volumes" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.365991 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.366236 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.366351 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.366434 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjsbn\" (UniqueName: \"kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.468618 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.468666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjsbn\" (UniqueName: \"kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.468799 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.468823 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.476240 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.477130 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.477680 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.491449 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjsbn\" (UniqueName: \"kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn\") pod \"kube-state-metrics-0\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " pod="openstack/kube-state-metrics-0" Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.551892 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.552451 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-central-agent" containerID="cri-o://00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f" gracePeriod=30 Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.552814 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="proxy-httpd" containerID="cri-o://cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a" gracePeriod=30 Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.552842 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-notification-agent" containerID="cri-o://7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568" gracePeriod=30 Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.552941 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="sg-core" containerID="cri-o://1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46" 
gracePeriod=30 Feb 01 07:11:04 crc kubenswrapper[5127]: I0201 07:11:04.553427 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.064183 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.125557 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd0a3f5a-2119-403c-8b4c-e452465a71e8","Type":"ContainerStarted","Data":"6ec24b098801298a2c0a06dff7d2350137852098ef7c5015f145b8edf2995c50"} Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129613 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerID="cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a" exitCode=0 Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129643 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerID="1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46" exitCode=2 Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129651 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerID="00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f" exitCode=0 Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129672 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerDied","Data":"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a"} Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129696 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerDied","Data":"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46"} Feb 01 07:11:05 crc kubenswrapper[5127]: I0201 07:11:05.129705 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerDied","Data":"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f"} Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.141383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd0a3f5a-2119-403c-8b4c-e452465a71e8","Type":"ContainerStarted","Data":"2c5095bb5c19bb3463f1baf571331379f62fe6f5cfacef8ce4ddbb8ec37e07f6"} Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.141713 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.164831 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.781164897 podStartE2EDuration="2.164794855s" podCreationTimestamp="2026-02-01 07:11:04 +0000 UTC" firstStartedPulling="2026-02-01 07:11:05.075234836 +0000 UTC m=+1415.561137199" lastFinishedPulling="2026-02-01 07:11:05.458864794 +0000 UTC m=+1415.944767157" observedRunningTime="2026-02-01 07:11:06.162839644 +0000 UTC m=+1416.648742027" watchObservedRunningTime="2026-02-01 07:11:06.164794855 +0000 UTC m=+1416.650697258" Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.740890 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.741223 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:11:06 crc kubenswrapper[5127]: I0201 07:11:06.915407 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.024812 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025071 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025152 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025250 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025368 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025455 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qct\" (UniqueName: \"kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.025640 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle\") pod \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\" (UID: \"9f5a66ae-021d-43bb-ba2c-74aa344c3334\") " Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.026013 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.026328 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.026412 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9f5a66ae-021d-43bb-ba2c-74aa344c3334-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.030934 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct" (OuterVolumeSpecName: "kube-api-access-45qct") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "kube-api-access-45qct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.030996 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts" (OuterVolumeSpecName: "scripts") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.077309 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.102782 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.128533 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.128626 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.128638 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qct\" (UniqueName: \"kubernetes.io/projected/9f5a66ae-021d-43bb-ba2c-74aa344c3334-kube-api-access-45qct\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.128649 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.137298 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data" (OuterVolumeSpecName: "config-data") pod "9f5a66ae-021d-43bb-ba2c-74aa344c3334" (UID: "9f5a66ae-021d-43bb-ba2c-74aa344c3334"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.150817 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerID="7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568" exitCode=0 Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.150868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerDied","Data":"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568"} Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.150925 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9f5a66ae-021d-43bb-ba2c-74aa344c3334","Type":"ContainerDied","Data":"160d07426a5f2cadeafc6d984720608ba07ca9bda2f032d8529ea32348ed52ca"} Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.150947 5127 scope.go:117] "RemoveContainer" containerID="cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.150968 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.171353 5127 scope.go:117] "RemoveContainer" containerID="1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.193049 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.203423 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.210167 5127 scope.go:117] "RemoveContainer" containerID="7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.223463 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.224019 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="proxy-httpd" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224046 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="proxy-httpd" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.224070 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="sg-core" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224083 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="sg-core" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.224121 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-notification-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224130 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-notification-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.224140 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-central-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224151 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-central-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224372 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-notification-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224390 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="sg-core" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224410 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="proxy-httpd" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.224423 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" containerName="ceilometer-central-agent" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.227211 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.230423 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f5a66ae-021d-43bb-ba2c-74aa344c3334-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.231026 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.231270 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.231454 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.231571 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.235240 5127 scope.go:117] "RemoveContainer" containerID="00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.259211 5127 scope.go:117] "RemoveContainer" containerID="cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.259656 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a\": container with ID starting with cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a not found: ID does not exist" containerID="cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.259696 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a"} err="failed to get container status \"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a\": rpc error: code = NotFound desc = could not find container \"cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a\": container with ID starting with cf6cdab1cb9f5a64074ce1b6ef6a85c20434a881f538275f6661172f35be431a not found: ID does not exist" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.259724 5127 scope.go:117] "RemoveContainer" containerID="1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.260087 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46\": container with ID starting with 1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46 not found: ID does not exist" containerID="1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.260125 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46"} err="failed to get container status \"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46\": rpc error: code = NotFound desc = could not find container \"1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46\": container with ID starting with 
1386f6d70f21b7ee33937fd9fc9308dbf09bd8ce021dddd70a5d197086638f46 not found: ID does not exist" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.260149 5127 scope.go:117] "RemoveContainer" containerID="7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.260423 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568\": container with ID starting with 7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568 not found: ID does not exist" containerID="7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.260450 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568"} err="failed to get container status \"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568\": rpc error: code = NotFound desc = could not find container \"7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568\": container with ID starting with 7bfe9c3208361a522b15908dc8dbf25983ef1842a6ddc69f1fb1615f2c865568 not found: ID does not exist" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.260472 5127 scope.go:117] "RemoveContainer" containerID="00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f" Feb 01 07:11:07 crc kubenswrapper[5127]: E0201 07:11:07.260735 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f\": container with ID starting with 00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f not found: ID does not exist" containerID="00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.260758 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f"} err="failed to get container status \"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f\": rpc error: code = NotFound desc = could not find container \"00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f\": container with ID starting with 00c9de7728e86d92e3ad9c8208eafdfa04fa7e39740e1c5d5270ed1986f2946f not found: ID does not exist" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.331980 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332070 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332115 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wn4z\" (UniqueName: 
\"kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332195 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332248 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332285 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332480 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.332613 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434108 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434177 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434289 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434326 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434362 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8wn4z\" (UniqueName: \"kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434410 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434442 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434471 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.434976 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.435275 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.438159 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.440323 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.441740 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.442375 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.442971 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.452656 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.469737 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wn4z\" (UniqueName: \"kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z\") pod \"ceilometer-0\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") " pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.551125 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.932234 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5ng2k"] Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.933648 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.935989 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.936080 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 01 07:11:07 crc kubenswrapper[5127]: I0201 07:11:07.948037 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5ng2k"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.049049 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.049088 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.049108 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.049204 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6phrc\" (UniqueName: \"kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.093054 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: W0201 07:11:08.103735 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70bc113f_723b_4328_9e57_6be9ae93b5cb.slice/crio-d884b3cd2fb8974fd02b7b89b10d060c906b1dbbb35b309116f7414ff1f821a4 WatchSource:0}: Error finding container d884b3cd2fb8974fd02b7b89b10d060c906b1dbbb35b309116f7414ff1f821a4: Status 404 returned error can't find the container with id d884b3cd2fb8974fd02b7b89b10d060c906b1dbbb35b309116f7414ff1f821a4 Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.150505 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.150551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.150569 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.150674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6phrc\" (UniqueName: \"kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.162547 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.162568 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.163095 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerStarted","Data":"d884b3cd2fb8974fd02b7b89b10d060c906b1dbbb35b309116f7414ff1f821a4"} Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.163140 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.164570 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.168392 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.169030 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.173799 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6phrc\" (UniqueName: \"kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc\") pod \"nova-cell0-cell-mapping-5ng2k\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.180549 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.207693 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.209383 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.213284 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.228679 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.258059 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5a66ae-021d-43bb-ba2c-74aa344c3334" path="/var/lib/kubelet/pods/9f5a66ae-021d-43bb-ba2c-74aa344c3334/volumes" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.259212 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.332276 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.347273 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.357673 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358086 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8d4p\" (UniqueName: \"kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358155 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x854z\" (UniqueName: \"kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358205 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwrcl\" (UniqueName: \"kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358271 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358336 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358362 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358402 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358432 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358459 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358477 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.358498 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.397616 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.449449 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.453137 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.456751 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.458960 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.458996 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.459031 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460079 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460519 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460547 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460566 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460601 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460621 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460642 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8d4p\" (UniqueName: \"kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x854z\" (UniqueName: \"kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460696 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwrcl\" (UniqueName: \"kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460740 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460756 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.460788 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqdb7\" (UniqueName: \"kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.464137 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.467476 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.469195 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.469645 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.475778 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.476326 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.481366 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.485769 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.487198 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwrcl\" (UniqueName: \"kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl\") pod \"nova-scheduler-0\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.488455 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x854z\" (UniqueName: \"kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z\") pod \"nova-api-0\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.488987 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8d4p\" (UniqueName: \"kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p\") pod \"nova-metadata-0\" (UID: 
\"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.491452 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.493040 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.539543 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.556160 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562026 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562082 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562102 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjwjw\" (UniqueName: \"kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562155 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562170 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562198 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562213 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " 
pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562243 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.562276 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqdb7\" (UniqueName: \"kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.568974 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.571821 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.588895 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.593948 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqdb7\" (UniqueName: \"kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7\") pod \"nova-cell1-novncproxy-0\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670062 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670133 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjwjw\" (UniqueName: \"kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670193 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670213 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: 
\"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670252 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.670274 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.671223 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.671755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.672504 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.673112 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.673678 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.705475 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjwjw\" (UniqueName: \"kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw\") pod \"dnsmasq-dns-557bbc7df7-gxls2\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.749673 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.771255 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.822448 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:08 crc kubenswrapper[5127]: I0201 07:11:08.935439 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5ng2k"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.024564 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.106823 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.108345 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.110657 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.111096 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.117982 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.148237 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.178177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwdw\" (UniqueName: \"kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.178246 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.178350 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.178402 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.198108 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4a833731-c214-46d5-966d-ff87d90a6501","Type":"ContainerStarted","Data":"7a54d5e6c701ecfffde051246dcebda3a0dea9bfbd8ec2f2c6d480379be2f5b4"} Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.207199 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerStarted","Data":"8c4fcbf25f45ddeb42f36ccc4f20b1a9d4c9728db9c136b9ea981107b88aa552"} Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.211142 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerStarted","Data":"3a2a2c655dd29b2aebae70b7233ae2df0528dd625d03162ba4c1c898097310b8"} Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.214541 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5ng2k" event={"ID":"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c","Type":"ContainerStarted","Data":"38a43a5d4ff1be61cb1d21b662a63b20300422bfa11d386d9ecdf7d41d3a1f0e"} Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.231649 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5ng2k" podStartSLOduration=2.231630657 podStartE2EDuration="2.231630657s" podCreationTimestamp="2026-02-01 07:11:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:09.23099721 +0000 UTC m=+1419.716899583" watchObservedRunningTime="2026-02-01 07:11:09.231630657 +0000 UTC m=+1419.717533020" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.279343 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.279416 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwdw\" (UniqueName: \"kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.279489 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.281446 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.284502 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") 
" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.287045 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.288061 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.305184 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwdw\" (UniqueName: \"kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw\") pod \"nova-cell1-conductor-db-sync-2zcqd\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.320289 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.338054 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.428563 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:09 crc kubenswrapper[5127]: W0201 07:11:09.433972 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf272e6a0_6e5c_4b7b_9118_a3ee4adae73f.slice/crio-8e00ee46b69eebbe7901c937874c0135e0b9d350989c6d1a3f7ca4cb25757286 WatchSource:0}: Error finding container 8e00ee46b69eebbe7901c937874c0135e0b9d350989c6d1a3f7ca4cb25757286: Status 404 returned error can't find the container with id 8e00ee46b69eebbe7901c937874c0135e0b9d350989c6d1a3f7ca4cb25757286 Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.437042 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:09 crc kubenswrapper[5127]: I0201 07:11:09.933400 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"] Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.243090 5127 generic.go:334] "Generic (PLEG): container finished" podID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerID="cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17" exitCode=0 Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.282983 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" event={"ID":"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f","Type":"ContainerDied","Data":"cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283029 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" event={"ID":"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f","Type":"ContainerStarted","Data":"8e00ee46b69eebbe7901c937874c0135e0b9d350989c6d1a3f7ca4cb25757286"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283056 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" event={"ID":"89710835-8f36-4539-90e7-7f442b5fd963","Type":"ContainerStarted","Data":"66450dde1ce30f6d39c47d2d34e929e4e2b217bd800a51665f22af1caeeb95b2"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283093 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"30fba91a-8a9d-43cd-9d35-5355db347855","Type":"ContainerStarted","Data":"753cd86846d32cae6e27d5b5c68366ab63cd79feeca9d8045c03b0977928555f"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283108 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerStarted","Data":"3e5013ec87f4223b60ab710d38021572bf1e511c76300d29da5408c9c4b53a8b"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283121 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5ng2k" event={"ID":"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c","Type":"ContainerStarted","Data":"fdfa4791e619968d1933af4255d70dd183b5deedea313c73ca0881970b0dfbbb"} Feb 01 07:11:10 crc kubenswrapper[5127]: I0201 07:11:10.283137 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerStarted","Data":"1725824f0685ea1cc81c4e2460822a7a0a3e952c57c4406955f2260eb51a2240"} Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.280738 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" event={"ID":"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f","Type":"ContainerStarted","Data":"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c"} Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.281015 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.282270 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" event={"ID":"89710835-8f36-4539-90e7-7f442b5fd963","Type":"ContainerStarted","Data":"63227caa383783fd418ece07b8dc86363707165b10731d1623e6cffade0f67b5"} Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.289002 
5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerStarted","Data":"efcbb319cd668430dd4c8ecca3d58d47f9cf2aaaaa4b1ac43810a468a87b7cbd"} Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.301321 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" podStartSLOduration=3.301306613 podStartE2EDuration="3.301306613s" podCreationTimestamp="2026-02-01 07:11:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:11.298955641 +0000 UTC m=+1421.784858004" watchObservedRunningTime="2026-02-01 07:11:11.301306613 +0000 UTC m=+1421.787208976" Feb 01 07:11:11 crc kubenswrapper[5127]: I0201 07:11:11.319556 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" podStartSLOduration=2.319535274 podStartE2EDuration="2.319535274s" podCreationTimestamp="2026-02-01 07:11:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:11.318698181 +0000 UTC m=+1421.804600574" watchObservedRunningTime="2026-02-01 07:11:11.319535274 +0000 UTC m=+1421.805437637" Feb 01 07:11:12 crc kubenswrapper[5127]: I0201 07:11:12.012524 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:12 crc kubenswrapper[5127]: I0201 07:11:12.027220 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.305760 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a833731-c214-46d5-966d-ff87d90a6501","Type":"ContainerStarted","Data":"e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.307389 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerStarted","Data":"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.307435 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerStarted","Data":"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.309228 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerStarted","Data":"071759326f699e0c8d2cc65ebcb098be6183ebea05028fcaa8cfb94bb6f1402b"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.309395 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.310604 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerStarted","Data":"4f25c38fc76a3e6c684ac3c80d888d8b615e7c6b31a8e4ff08ed1e377cd941ee"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.310641 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerStarted","Data":"cb86fe852f05b7a0fca92ae79cf48f3636bdb610ba6973e39a8b87ac41355a27"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.310755 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-metadata" containerID="cri-o://4f25c38fc76a3e6c684ac3c80d888d8b615e7c6b31a8e4ff08ed1e377cd941ee" gracePeriod=30 Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.310704 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-log" containerID="cri-o://cb86fe852f05b7a0fca92ae79cf48f3636bdb610ba6973e39a8b87ac41355a27" gracePeriod=30 Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.314947 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"30fba91a-8a9d-43cd-9d35-5355db347855","Type":"ContainerStarted","Data":"21dac6cdee8fcbf4721f8b5699192ce6257027831a12acb80385cdf434ea81b6"} Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.315153 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="30fba91a-8a9d-43cd-9d35-5355db347855" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://21dac6cdee8fcbf4721f8b5699192ce6257027831a12acb80385cdf434ea81b6" gracePeriod=30 Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.334811 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.901199111 podStartE2EDuration="5.334792646s" podCreationTimestamp="2026-02-01 07:11:08 +0000 UTC" firstStartedPulling="2026-02-01 07:11:09.04236445 +0000 UTC m=+1419.528266813" lastFinishedPulling="2026-02-01 07:11:12.475957965 +0000 UTC m=+1422.961860348" observedRunningTime="2026-02-01 07:11:13.325939373 +0000 UTC m=+1423.811841736" watchObservedRunningTime="2026-02-01 07:11:13.334792646 +0000 UTC m=+1423.820695009" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.357656 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9617847560000001 podStartE2EDuration="6.357629358s" podCreationTimestamp="2026-02-01 07:11:07 +0000 UTC" firstStartedPulling="2026-02-01 07:11:08.106184761 +0000 UTC m=+1418.592087124" lastFinishedPulling="2026-02-01 07:11:12.502029363 +0000 UTC m=+1422.987931726" observedRunningTime="2026-02-01 07:11:13.352483782 +0000 UTC m=+1423.838386145" watchObservedRunningTime="2026-02-01 07:11:13.357629358 +0000 UTC m=+1423.843531721" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.374170 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.061446632 podStartE2EDuration="5.374003919s" podCreationTimestamp="2026-02-01 07:11:08 +0000 UTC" firstStartedPulling="2026-02-01 07:11:09.16232063 +0000 UTC m=+1419.648222993" lastFinishedPulling="2026-02-01 07:11:12.474877907 +0000 UTC m=+1422.960780280" observedRunningTime="2026-02-01 07:11:13.367641972 +0000 UTC m=+1423.853544335" watchObservedRunningTime="2026-02-01 07:11:13.374003919 +0000 UTC m=+1423.859906282" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.387254 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.2484926610000002 podStartE2EDuration="5.387231168s" podCreationTimestamp="2026-02-01 07:11:08 +0000 UTC" firstStartedPulling="2026-02-01 07:11:09.327137793 +0000 UTC m=+1419.813040156" lastFinishedPulling="2026-02-01 07:11:12.46587626 +0000 UTC m=+1422.951778663" observedRunningTime="2026-02-01 07:11:13.383721155 +0000 UTC m=+1423.869623518" watchObservedRunningTime="2026-02-01 07:11:13.387231168 +0000 UTC m=+1423.873133531" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.408788 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.25381248 podStartE2EDuration="5.408768545s" podCreationTimestamp="2026-02-01 07:11:08 +0000 UTC" firstStartedPulling="2026-02-01 07:11:09.335602176 +0000 UTC m=+1419.821504529" lastFinishedPulling="2026-02-01 07:11:12.490558231 +0000 UTC m=+1422.976460594" observedRunningTime="2026-02-01 07:11:13.404114953 +0000 UTC m=+1423.890017316" watchObservedRunningTime="2026-02-01 07:11:13.408768545 +0000 UTC m=+1423.894670898" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.589973 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.750570 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.750637 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 07:11:13 crc kubenswrapper[5127]: I0201 07:11:13.772030 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:14 crc kubenswrapper[5127]: I0201 07:11:14.332230 5127 generic.go:334] "Generic (PLEG): container finished" podID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerID="cb86fe852f05b7a0fca92ae79cf48f3636bdb610ba6973e39a8b87ac41355a27" exitCode=143 Feb 01 07:11:14 crc kubenswrapper[5127]: I0201 07:11:14.332302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerDied","Data":"cb86fe852f05b7a0fca92ae79cf48f3636bdb610ba6973e39a8b87ac41355a27"} Feb 01 07:11:14 crc kubenswrapper[5127]: I0201 07:11:14.567393 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 01 07:11:17 crc kubenswrapper[5127]: I0201 07:11:17.368491 5127 generic.go:334] "Generic (PLEG): container finished" podID="6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" containerID="fdfa4791e619968d1933af4255d70dd183b5deedea313c73ca0881970b0dfbbb" exitCode=0 Feb 01 07:11:17 crc kubenswrapper[5127]: I0201 07:11:17.368783 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5ng2k" event={"ID":"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c","Type":"ContainerDied","Data":"fdfa4791e619968d1933af4255d70dd183b5deedea313c73ca0881970b0dfbbb"} Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.379718 5127 generic.go:334] "Generic (PLEG): container finished" podID="89710835-8f36-4539-90e7-7f442b5fd963" containerID="63227caa383783fd418ece07b8dc86363707165b10731d1623e6cffade0f67b5" exitCode=0 Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.379837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" 
event={"ID":"89710835-8f36-4539-90e7-7f442b5fd963","Type":"ContainerDied","Data":"63227caa383783fd418ece07b8dc86363707165b10731d1623e6cffade0f67b5"} Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.556996 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.557071 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.590220 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.621190 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.732648 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.824725 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.898634 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"] Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.898901 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="dnsmasq-dns" containerID="cri-o://bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa" gracePeriod=10 Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.899302 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle\") pod \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.899498 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data\") pod \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.899547 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts\") pod \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.899634 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6phrc\" (UniqueName: \"kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc\") pod \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\" (UID: \"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c\") " Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.910559 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc" (OuterVolumeSpecName: "kube-api-access-6phrc") pod "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" (UID: "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c"). InnerVolumeSpecName "kube-api-access-6phrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.912468 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts" (OuterVolumeSpecName: "scripts") pod "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" (UID: "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.944815 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data" (OuterVolumeSpecName: "config-data") pod "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" (UID: "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:18 crc kubenswrapper[5127]: I0201 07:11:18.968104 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" (UID: "6e62122a-43cd-4d84-a3b9-2ab7472dbf1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.002959 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.003012 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6phrc\" (UniqueName: \"kubernetes.io/projected/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-kube-api-access-6phrc\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.003026 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.003038 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.321270 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.388286 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5ng2k" event={"ID":"6e62122a-43cd-4d84-a3b9-2ab7472dbf1c","Type":"ContainerDied","Data":"38a43a5d4ff1be61cb1d21b662a63b20300422bfa11d386d9ecdf7d41d3a1f0e"} Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.388323 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a43a5d4ff1be61cb1d21b662a63b20300422bfa11d386d9ecdf7d41d3a1f0e" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.388393 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5ng2k" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.411293 5127 generic.go:334] "Generic (PLEG): container finished" podID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerID="bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa" exitCode=0 Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.411667 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" event={"ID":"28fa3b9a-8a1d-4954-89eb-6bf203c729d2","Type":"ContainerDied","Data":"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa"} Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.411713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" event={"ID":"28fa3b9a-8a1d-4954-89eb-6bf203c729d2","Type":"ContainerDied","Data":"ca21adb9e284a0633afbc3c173e31c2cdefe126ac7fe4580fc56eae716e78b85"} Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.411733 5127 scope.go:117] "RemoveContainer" containerID="bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.416638 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-lm7b9" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.453144 5127 scope.go:117] "RemoveContainer" containerID="fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.467778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.479531 5127 scope.go:117] "RemoveContainer" containerID="bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa" Feb 01 07:11:19 crc kubenswrapper[5127]: E0201 07:11:19.480012 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa\": container with ID starting with bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa not found: ID does not exist" containerID="bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.480055 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa"} err="failed to get container status \"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa\": rpc error: code = NotFound desc = could not find container \"bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa\": container with ID starting with bb71921dd91d8f562e31171325ada66b49d54b3ed7d4a14481385a94d0c98aaa not found: ID does not exist" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.480080 5127 scope.go:117] "RemoveContainer" containerID="fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02" Feb 01 07:11:19 crc kubenswrapper[5127]: E0201 07:11:19.480348 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02\": container with ID starting with fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02 not found: ID does not exist" containerID="fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02" Feb 
01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.480374 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02"} err="failed to get container status \"fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02\": rpc error: code = NotFound desc = could not find container \"fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02\": container with ID starting with fcc399e6f01d7bbd545aac2a3af4b1edd06563d8c72e08bcf23dcd560454ba02 not found: ID does not exist" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514429 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514489 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514542 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514561 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514692 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45p7k\" (UniqueName: \"kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.514743 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0\") pod \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\" (UID: \"28fa3b9a-8a1d-4954-89eb-6bf203c729d2\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.519455 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k" (OuterVolumeSpecName: "kube-api-access-45p7k") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). InnerVolumeSpecName "kube-api-access-45p7k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.592342 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.592551 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-log" containerID="cri-o://65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79" gracePeriod=30 Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.593530 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-api" containerID="cri-o://423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4" gracePeriod=30 Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.601230 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.604790 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.617430 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45p7k\" (UniqueName: \"kubernetes.io/projected/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-kube-api-access-45p7k\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.619083 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.627011 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.641240 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.655709 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.659201 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config" (OuterVolumeSpecName: "config") pod "28fa3b9a-8a1d-4954-89eb-6bf203c729d2" (UID: "28fa3b9a-8a1d-4954-89eb-6bf203c729d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.719395 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.719434 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.719442 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.719450 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.719459 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28fa3b9a-8a1d-4954-89eb-6bf203c729d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.838118 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.855369 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"] Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.866558 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-lm7b9"] Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.922374 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle\") pod \"89710835-8f36-4539-90e7-7f442b5fd963\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.922715 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwdw\" (UniqueName: \"kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw\") pod \"89710835-8f36-4539-90e7-7f442b5fd963\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.922805 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts\") pod \"89710835-8f36-4539-90e7-7f442b5fd963\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.922986 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data\") pod \"89710835-8f36-4539-90e7-7f442b5fd963\" (UID: \"89710835-8f36-4539-90e7-7f442b5fd963\") " Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.925827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts" (OuterVolumeSpecName: "scripts") pod "89710835-8f36-4539-90e7-7f442b5fd963" (UID: "89710835-8f36-4539-90e7-7f442b5fd963"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.925870 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw" (OuterVolumeSpecName: "kube-api-access-6pwdw") pod "89710835-8f36-4539-90e7-7f442b5fd963" (UID: "89710835-8f36-4539-90e7-7f442b5fd963"). InnerVolumeSpecName "kube-api-access-6pwdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.959662 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89710835-8f36-4539-90e7-7f442b5fd963" (UID: "89710835-8f36-4539-90e7-7f442b5fd963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:19 crc kubenswrapper[5127]: I0201 07:11:19.960986 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data" (OuterVolumeSpecName: "config-data") pod "89710835-8f36-4539-90e7-7f442b5fd963" (UID: "89710835-8f36-4539-90e7-7f442b5fd963"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.019637 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.025933 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.025958 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.025968 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89710835-8f36-4539-90e7-7f442b5fd963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.025977 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pwdw\" (UniqueName: \"kubernetes.io/projected/89710835-8f36-4539-90e7-7f442b5fd963-kube-api-access-6pwdw\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.247841 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" path="/var/lib/kubelet/pods/28fa3b9a-8a1d-4954-89eb-6bf203c729d2/volumes" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.424551 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerID="65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79" exitCode=143 Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.424659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerDied","Data":"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79"} Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.431490 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.432346 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2zcqd" event={"ID":"89710835-8f36-4539-90e7-7f442b5fd963","Type":"ContainerDied","Data":"66450dde1ce30f6d39c47d2d34e929e4e2b217bd800a51665f22af1caeeb95b2"} Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.432380 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66450dde1ce30f6d39c47d2d34e929e4e2b217bd800a51665f22af1caeeb95b2" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.486639 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:11:20 crc kubenswrapper[5127]: E0201 07:11:20.487066 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89710835-8f36-4539-90e7-7f442b5fd963" containerName="nova-cell1-conductor-db-sync" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487084 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89710835-8f36-4539-90e7-7f442b5fd963" containerName="nova-cell1-conductor-db-sync" Feb 01 07:11:20 crc kubenswrapper[5127]: E0201 07:11:20.487099 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="dnsmasq-dns" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487108 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="dnsmasq-dns" Feb 01 07:11:20 crc kubenswrapper[5127]: E0201 07:11:20.487129 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" containerName="nova-manage" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487139 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" containerName="nova-manage" Feb 01 07:11:20 crc kubenswrapper[5127]: E0201 07:11:20.487153 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="init" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487159 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="init" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487383 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" containerName="nova-manage" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487414 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fa3b9a-8a1d-4954-89eb-6bf203c729d2" containerName="dnsmasq-dns" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.487436 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="89710835-8f36-4539-90e7-7f442b5fd963" containerName="nova-cell1-conductor-db-sync" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.488187 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.491007 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.512157 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.640286 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2nqv\" (UniqueName: \"kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.640355 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.640385 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.742095 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2nqv\" (UniqueName: \"kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.742161 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.742188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.770032 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.770640 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2nqv\" (UniqueName: \"kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.771382 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:20 crc kubenswrapper[5127]: I0201 07:11:20.808000 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:21 crc kubenswrapper[5127]: I0201 07:11:21.338198 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:11:21 crc kubenswrapper[5127]: I0201 07:11:21.444665 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9df5c029-e707-4159-b8ec-2fb5dba38094","Type":"ContainerStarted","Data":"8c1c820ec5d7e58917ecb2d3c8c7355c70f4fcebf4b5aabdb47db8afcd051217"} Feb 01 07:11:21 crc kubenswrapper[5127]: I0201 07:11:21.444876 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4a833731-c214-46d5-966d-ff87d90a6501" containerName="nova-scheduler-scheduler" containerID="cri-o://e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" gracePeriod=30 Feb 01 07:11:22 crc kubenswrapper[5127]: I0201 07:11:22.462469 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9df5c029-e707-4159-b8ec-2fb5dba38094","Type":"ContainerStarted","Data":"c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601"} Feb 01 07:11:22 crc kubenswrapper[5127]: I0201 07:11:22.463275 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:22 crc kubenswrapper[5127]: I0201 07:11:22.490969 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.490938501 podStartE2EDuration="2.490938501s" podCreationTimestamp="2026-02-01 07:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:22.483325221 +0000 UTC m=+1432.969227624" watchObservedRunningTime="2026-02-01 07:11:22.490938501 +0000 UTC m=+1432.976840904" Feb 01 07:11:23 crc kubenswrapper[5127]: E0201 07:11:23.593462 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:11:23 crc kubenswrapper[5127]: E0201 07:11:23.596369 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:11:23 crc kubenswrapper[5127]: E0201 07:11:23.599108 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:11:23 crc 
Feb 01 07:11:23 crc kubenswrapper[5127]: E0201 07:11:23.599202 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4a833731-c214-46d5-966d-ff87d90a6501" containerName="nova-scheduler-scheduler" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.485655 5127 generic.go:334] "Generic (PLEG): container finished" podID="4a833731-c214-46d5-966d-ff87d90a6501" containerID="e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" exitCode=0 Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.485753 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a833731-c214-46d5-966d-ff87d90a6501","Type":"ContainerDied","Data":"e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928"} Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.608087 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.738442 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data\") pod \"4a833731-c214-46d5-966d-ff87d90a6501\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.738634 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle\") pod \"4a833731-c214-46d5-966d-ff87d90a6501\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.738665 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwrcl\" (UniqueName: \"kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl\") pod \"4a833731-c214-46d5-966d-ff87d90a6501\" (UID: \"4a833731-c214-46d5-966d-ff87d90a6501\") " Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.750225 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl" (OuterVolumeSpecName: "kube-api-access-dwrcl") pod "4a833731-c214-46d5-966d-ff87d90a6501" (UID: "4a833731-c214-46d5-966d-ff87d90a6501"). InnerVolumeSpecName "kube-api-access-dwrcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.768009 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a833731-c214-46d5-966d-ff87d90a6501" (UID: "4a833731-c214-46d5-966d-ff87d90a6501"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.779364 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data" (OuterVolumeSpecName: "config-data") pod "4a833731-c214-46d5-966d-ff87d90a6501" (UID: "4a833731-c214-46d5-966d-ff87d90a6501"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.841289 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.841340 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a833731-c214-46d5-966d-ff87d90a6501-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:24 crc kubenswrapper[5127]: I0201 07:11:24.841368 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwrcl\" (UniqueName: \"kubernetes.io/projected/4a833731-c214-46d5-966d-ff87d90a6501-kube-api-access-dwrcl\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.488049 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.499478 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a833731-c214-46d5-966d-ff87d90a6501","Type":"ContainerDied","Data":"7a54d5e6c701ecfffde051246dcebda3a0dea9bfbd8ec2f2c6d480379be2f5b4"} Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.499539 5127 scope.go:117] "RemoveContainer" containerID="e6ac5652d8659d4cfd8c4dfdc10bdb33000b671107ee89df539df982ebd2f928" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.499492 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.502242 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerID="423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4" exitCode=0 Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.502278 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerDied","Data":"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4"} Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.502302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce2744a6-29be-47f0-bd03-fb8164b9f47a","Type":"ContainerDied","Data":"8c4fcbf25f45ddeb42f36ccc4f20b1a9d4c9728db9c136b9ea981107b88aa552"} Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.502339 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.528935 5127 scope.go:117] "RemoveContainer" containerID="423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.564877 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.568089 5127 scope.go:117] "RemoveContainer" containerID="65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.588919 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.598084 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: E0201 07:11:25.598936 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a833731-c214-46d5-966d-ff87d90a6501" containerName="nova-scheduler-scheduler" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.598961 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a833731-c214-46d5-966d-ff87d90a6501" containerName="nova-scheduler-scheduler" Feb 01 07:11:25 crc kubenswrapper[5127]: E0201 07:11:25.599012 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-log" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.599022 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-log" Feb 01 07:11:25 crc kubenswrapper[5127]: E0201 07:11:25.599063 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-api" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.599071 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-api" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.599379 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a833731-c214-46d5-966d-ff87d90a6501" containerName="nova-scheduler-scheduler" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.599418 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-api" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.599443 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" containerName="nova-api-log" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.600486 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.602526 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.607121 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.607325 5127 scope.go:117] "RemoveContainer" containerID="423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4" Feb 01 07:11:25 crc kubenswrapper[5127]: E0201 07:11:25.609184 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4\": container with ID starting with 423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4 not found: ID does not exist" containerID="423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.609229 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4"} err="failed to get container status \"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4\": rpc error: code = NotFound desc = could not find container \"423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4\": container with ID starting with 423903be285283ed6b0d376df23a03e55dff4c8ee3659b8dd198c181420d49a4 not found: ID does not exist" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.609256 5127 scope.go:117] "RemoveContainer" containerID="65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79" Feb 01 07:11:25 crc kubenswrapper[5127]: E0201 07:11:25.610768 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79\": container with ID starting with 65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79 not found: ID does not exist" containerID="65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.610803 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79"} err="failed to get container status \"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79\": rpc error: code = NotFound desc = could not find container \"65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79\": container with ID starting with 65c9590191a2c1917b72b4a8d07810245d7dc00f574b409d34f0face202aca79 not found: ID does not exist" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.673545 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs\") pod \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.673659 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data\") pod \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " Feb 01 07:11:25 crc 
Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.673820 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle\") pod \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.673898 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x854z\" (UniqueName: \"kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z\") pod \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\" (UID: \"ce2744a6-29be-47f0-bd03-fb8164b9f47a\") " Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.674019 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs" (OuterVolumeSpecName: "logs") pod "ce2744a6-29be-47f0-bd03-fb8164b9f47a" (UID: "ce2744a6-29be-47f0-bd03-fb8164b9f47a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.674646 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2744a6-29be-47f0-bd03-fb8164b9f47a-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.695899 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z" (OuterVolumeSpecName: "kube-api-access-x854z") pod "ce2744a6-29be-47f0-bd03-fb8164b9f47a" (UID: "ce2744a6-29be-47f0-bd03-fb8164b9f47a"). InnerVolumeSpecName "kube-api-access-x854z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.707210 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data" (OuterVolumeSpecName: "config-data") pod "ce2744a6-29be-47f0-bd03-fb8164b9f47a" (UID: "ce2744a6-29be-47f0-bd03-fb8164b9f47a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.707298 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce2744a6-29be-47f0-bd03-fb8164b9f47a" (UID: "ce2744a6-29be-47f0-bd03-fb8164b9f47a"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776097 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776285 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlhcr\" (UniqueName: \"kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776380 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776443 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x854z\" (UniqueName: \"kubernetes.io/projected/ce2744a6-29be-47f0-bd03-fb8164b9f47a-kube-api-access-x854z\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776525 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.776538 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2744a6-29be-47f0-bd03-fb8164b9f47a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.844847 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.856238 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.871318 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.872733 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.876473 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.877964 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.878088 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlhcr\" (UniqueName: \"kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.878175 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.883273 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.883366 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.888899 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.900197 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlhcr\" (UniqueName: \"kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr\") pod \"nova-scheduler-0\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") " pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.914604 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.981026 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.981089 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.981136 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:25 crc kubenswrapper[5127]: I0201 07:11:25.981217 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv84d\" (UniqueName: \"kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.082870 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv84d\" (UniqueName: \"kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.083504 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.083534 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.083656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.084309 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.092689 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data\") pod \"nova-api-0\" (UID: 
\"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.093544 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.098591 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv84d\" (UniqueName: \"kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d\") pod \"nova-api-0\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.190464 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.255341 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a833731-c214-46d5-966d-ff87d90a6501" path="/var/lib/kubelet/pods/4a833731-c214-46d5-966d-ff87d90a6501/volumes" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.256910 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2744a6-29be-47f0-bd03-fb8164b9f47a" path="/var/lib/kubelet/pods/ce2744a6-29be-47f0-bd03-fb8164b9f47a/volumes" Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.387523 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:11:26 crc kubenswrapper[5127]: W0201 07:11:26.391782 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f9738e6_08e9_42a5_b2c9_1b9f9d944239.slice/crio-25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c WatchSource:0}: Error finding container 25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c: Status 404 returned error can't find the container with id 25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.515527 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f9738e6-08e9-42a5-b2c9-1b9f9d944239","Type":"ContainerStarted","Data":"25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c"} Feb 01 07:11:26 crc kubenswrapper[5127]: I0201 07:11:26.702404 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.530801 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerStarted","Data":"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff"} Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.531331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerStarted","Data":"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"} Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.531358 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerStarted","Data":"4aa48ff0d3bfac9d9a42b4b70a134fec75ca08c77b6c35febb9a5fd8f7b2aae2"} Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.536116 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f9738e6-08e9-42a5-b2c9-1b9f9d944239","Type":"ContainerStarted","Data":"38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c"} Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.565000 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.564967164 podStartE2EDuration="2.564967164s" podCreationTimestamp="2026-02-01 07:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:27.55613567 +0000 UTC m=+1438.042038073" watchObservedRunningTime="2026-02-01 07:11:27.564967164 +0000 UTC m=+1438.050869557" Feb 01 07:11:27 crc kubenswrapper[5127]: I0201 07:11:27.577845 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.577820552 podStartE2EDuration="2.577820552s" podCreationTimestamp="2026-02-01 07:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:27.572886171 +0000 UTC m=+1438.058788534" watchObservedRunningTime="2026-02-01 07:11:27.577820552 +0000 UTC m=+1438.063722955" Feb 01 07:11:30 crc kubenswrapper[5127]: I0201 07:11:30.856699 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 01 07:11:30 crc kubenswrapper[5127]: I0201 07:11:30.915244 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 07:11:35 crc kubenswrapper[5127]: I0201 07:11:35.915639 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 07:11:35 crc kubenswrapper[5127]: I0201 07:11:35.961550 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 07:11:36 crc kubenswrapper[5127]: I0201 07:11:36.191274 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:11:36 crc kubenswrapper[5127]: I0201 07:11:36.191685 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 07:11:36 crc kubenswrapper[5127]: I0201 07:11:36.688468 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 07:11:36 crc kubenswrapper[5127]: I0201 07:11:36.740365 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:11:36 crc kubenswrapper[5127]: I0201 07:11:36.740437 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:11:37 crc kubenswrapper[5127]: I0201 07:11:37.275767 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:37 crc kubenswrapper[5127]: I0201 07:11:37.275851 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:37 crc kubenswrapper[5127]: I0201 07:11:37.566654 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.710091 5127 generic.go:334] "Generic (PLEG): container finished" podID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerID="4f25c38fc76a3e6c684ac3c80d888d8b615e7c6b31a8e4ff08ed1e377cd941ee" exitCode=137 Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.710154 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerDied","Data":"4f25c38fc76a3e6c684ac3c80d888d8b615e7c6b31a8e4ff08ed1e377cd941ee"} Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.712875 5127 generic.go:334] "Generic (PLEG): container finished" podID="30fba91a-8a9d-43cd-9d35-5355db347855" containerID="21dac6cdee8fcbf4721f8b5699192ce6257027831a12acb80385cdf434ea81b6" exitCode=137 Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.712892 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"30fba91a-8a9d-43cd-9d35-5355db347855","Type":"ContainerDied","Data":"21dac6cdee8fcbf4721f8b5699192ce6257027831a12acb80385cdf434ea81b6"} Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.912263 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.918937 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974707 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs\") pod \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974756 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqdb7\" (UniqueName: \"kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7\") pod \"30fba91a-8a9d-43cd-9d35-5355db347855\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974825 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle\") pod \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data\") pod \"30fba91a-8a9d-43cd-9d35-5355db347855\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974974 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle\") pod \"30fba91a-8a9d-43cd-9d35-5355db347855\" (UID: \"30fba91a-8a9d-43cd-9d35-5355db347855\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.974995 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data\") pod \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.975015 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8d4p\" (UniqueName: \"kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p\") pod \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\" (UID: \"f01359bc-1b04-4ee2-9d82-1b2cb5d69560\") " Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.975050 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs" (OuterVolumeSpecName: "logs") pod "f01359bc-1b04-4ee2-9d82-1b2cb5d69560" (UID: "f01359bc-1b04-4ee2-9d82-1b2cb5d69560"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.975411 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.981234 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p" (OuterVolumeSpecName: "kube-api-access-m8d4p") pod "f01359bc-1b04-4ee2-9d82-1b2cb5d69560" (UID: "f01359bc-1b04-4ee2-9d82-1b2cb5d69560"). InnerVolumeSpecName "kube-api-access-m8d4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:43 crc kubenswrapper[5127]: I0201 07:11:43.981447 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7" (OuterVolumeSpecName: "kube-api-access-sqdb7") pod "30fba91a-8a9d-43cd-9d35-5355db347855" (UID: "30fba91a-8a9d-43cd-9d35-5355db347855"). InnerVolumeSpecName "kube-api-access-sqdb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.001551 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f01359bc-1b04-4ee2-9d82-1b2cb5d69560" (UID: "f01359bc-1b04-4ee2-9d82-1b2cb5d69560"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.005029 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30fba91a-8a9d-43cd-9d35-5355db347855" (UID: "30fba91a-8a9d-43cd-9d35-5355db347855"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.016183 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data" (OuterVolumeSpecName: "config-data") pod "30fba91a-8a9d-43cd-9d35-5355db347855" (UID: "30fba91a-8a9d-43cd-9d35-5355db347855"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.026333 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data" (OuterVolumeSpecName: "config-data") pod "f01359bc-1b04-4ee2-9d82-1b2cb5d69560" (UID: "f01359bc-1b04-4ee2-9d82-1b2cb5d69560"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077798 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077876 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fba91a-8a9d-43cd-9d35-5355db347855-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077894 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077903 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8d4p\" (UniqueName: \"kubernetes.io/projected/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-kube-api-access-m8d4p\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077916 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqdb7\" (UniqueName: \"kubernetes.io/projected/30fba91a-8a9d-43cd-9d35-5355db347855-kube-api-access-sqdb7\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.077948 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f01359bc-1b04-4ee2-9d82-1b2cb5d69560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.727768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f01359bc-1b04-4ee2-9d82-1b2cb5d69560","Type":"ContainerDied","Data":"1725824f0685ea1cc81c4e2460822a7a0a3e952c57c4406955f2260eb51a2240"} Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.727817 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.727884 5127 scope.go:117] "RemoveContainer" containerID="4f25c38fc76a3e6c684ac3c80d888d8b615e7c6b31a8e4ff08ed1e377cd941ee" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.731396 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"30fba91a-8a9d-43cd-9d35-5355db347855","Type":"ContainerDied","Data":"753cd86846d32cae6e27d5b5c68366ab63cd79feeca9d8045c03b0977928555f"} Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.731532 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.774415 5127 scope.go:117] "RemoveContainer" containerID="cb86fe852f05b7a0fca92ae79cf48f3636bdb610ba6973e39a8b87ac41355a27" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.774909 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.798699 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.807735 5127 scope.go:117] "RemoveContainer" containerID="21dac6cdee8fcbf4721f8b5699192ce6257027831a12acb80385cdf434ea81b6" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.814027 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.829118 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.842238 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:11:44 crc kubenswrapper[5127]: E0201 07:11:44.843079 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-metadata" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843108 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-metadata" Feb 01 07:11:44 crc kubenswrapper[5127]: E0201 07:11:44.843146 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-log" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843157 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-log" Feb 01 07:11:44 crc kubenswrapper[5127]: E0201 07:11:44.843177 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fba91a-8a9d-43cd-9d35-5355db347855" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843188 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fba91a-8a9d-43cd-9d35-5355db347855" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843450 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-metadata" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843480 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fba91a-8a9d-43cd-9d35-5355db347855" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.843503 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" containerName="nova-metadata-log" Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.845089 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.849408 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.849825 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.853694 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.857116 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.859701 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.859792 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.859895 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.862990 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.879800 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.894845 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895091 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895214 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895294 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895408 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89qqq\" (UniqueName: \"kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895477 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895544 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895749 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76r8\" (UniqueName: \"kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895845 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.895932 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.997278 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.997620 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.997656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998011 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998129 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89qqq\" (UniqueName: \"kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998150 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998166 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998489 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76r8\" (UniqueName: \"kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998515 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998630 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:44 crc kubenswrapper[5127]: I0201 07:11:44.998720 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.003073 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.004260 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.004785 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.005218 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.006192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.007477 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.008193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.019704 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89qqq\" (UniqueName: \"kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq\") pod \"nova-metadata-0\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") " pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.022209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76r8\" (UniqueName: \"kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8\") pod \"nova-cell1-novncproxy-0\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.173052 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.186285 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.504441 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.743288 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerStarted","Data":"55e02135dd6d48cfacc2b665ad5b536b2ee48bdd18d3d3a2254845cf426732ca"}
Feb 01 07:11:45 crc kubenswrapper[5127]: I0201 07:11:45.774174 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.199165 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.199841 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.199967 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.250008 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fba91a-8a9d-43cd-9d35-5355db347855" path="/var/lib/kubelet/pods/30fba91a-8a9d-43cd-9d35-5355db347855/volumes"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.250779 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f01359bc-1b04-4ee2-9d82-1b2cb5d69560" path="/var/lib/kubelet/pods/f01359bc-1b04-4ee2-9d82-1b2cb5d69560/volumes"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.382208 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.760139 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerStarted","Data":"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"}
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.760203 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerStarted","Data":"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"}
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.763026 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e","Type":"ContainerStarted","Data":"59ccaa8d2bba84519b6c4dd0057f50e45fb9b19f2e881b045bcfc6bbe203275b"}
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.763258 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.763408 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e","Type":"ContainerStarted","Data":"9f440cf398021049af247233fa14d7fb089fd75c537c5c0c29d78cbac3be31f0"}
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.767616 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.823107 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.821571098 podStartE2EDuration="2.821571098s" podCreationTimestamp="2026-02-01 07:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:46.787725965 +0000 UTC m=+1457.273628418" watchObservedRunningTime="2026-02-01 07:11:46.821571098 +0000 UTC m=+1457.307473501"
Feb 01 07:11:46 crc kubenswrapper[5127]: I0201 07:11:46.851491 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.851471615 podStartE2EDuration="2.851471615s" podCreationTimestamp="2026-02-01 07:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:46.812922089 +0000 UTC m=+1457.298824472" watchObservedRunningTime="2026-02-01 07:11:46.851471615 +0000 UTC m=+1457.337373988"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.004883 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"]
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.006572 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.068648 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"]
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.143220 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.143528 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.143672 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.143844 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.143950 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
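[Editor's note] In the pod_startup_latency_tracker entries above the image-pull timestamps are the zero time (0001-01-01), so podStartSLOduration appears to reduce to watchObservedRunningTime minus podCreationTimestamp; the nova-metadata-0 numbers check out: 07:11:46.821571098 - 07:11:44 = 2.821571098 s. A tiny Go sketch reproducing that arithmetic from the timestamps as logged (reading the general formula off these two samples is my inference, not a statement about the tracker's full logic):

    // startupslo.go - recompute podStartSLOduration from the logged timestamps.
    package main

    import (
        "fmt"
        "time"
    )

    // Go's default time.Time string layout, which these log fields use.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func main() {
        created, err := time.Parse(layout, "2026-02-01 07:11:44 +0000 UTC")
        if err != nil {
            panic(err)
        }
        observed, err := time.Parse(layout, "2026-02-01 07:11:46.821571098 +0000 UTC")
        if err != nil {
            panic(err)
        }
        // Prints 2.821571098, matching podStartSLOduration for nova-metadata-0.
        fmt.Println(observed.Sub(created).Seconds())
    }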
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.144088 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xgp\" (UniqueName: \"kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245702 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245766 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245815 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245894 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245917 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.245973 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xgp\" (UniqueName: \"kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.246869 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.247520 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.247650 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.247754 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.249413 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.270025 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xgp\" (UniqueName: \"kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp\") pod \"dnsmasq-dns-5ddd577785-fhj5m\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.325078 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:47 crc kubenswrapper[5127]: I0201 07:11:47.820320 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"]
Feb 01 07:11:48 crc kubenswrapper[5127]: I0201 07:11:48.781271 5127 generic.go:334] "Generic (PLEG): container finished" podID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerID="0e8a3a74180f02556c4a75f8eb7281666b800f01ef25e92d264a64ed5ddcb187" exitCode=0
Feb 01 07:11:48 crc kubenswrapper[5127]: I0201 07:11:48.781329 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" event={"ID":"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a","Type":"ContainerDied","Data":"0e8a3a74180f02556c4a75f8eb7281666b800f01ef25e92d264a64ed5ddcb187"}
Feb 01 07:11:48 crc kubenswrapper[5127]: I0201 07:11:48.781622 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" event={"ID":"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a","Type":"ContainerStarted","Data":"89799b9f4e14929be8b177e15bc077f3232094ccf9f7f0fd4cc2b3fc7c05cb79"}
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.198770 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.199388 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-central-agent" containerID="cri-o://3a2a2c655dd29b2aebae70b7233ae2df0528dd625d03162ba4c1c898097310b8" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.199661 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="proxy-httpd" containerID="cri-o://071759326f699e0c8d2cc65ebcb098be6183ebea05028fcaa8cfb94bb6f1402b" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.199709 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="sg-core" containerID="cri-o://efcbb319cd668430dd4c8ecca3d58d47f9cf2aaaaa4b1ac43810a468a87b7cbd" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.199750 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-notification-agent" containerID="cri-o://3e5013ec87f4223b60ab710d38021572bf1e511c76300d29da5408c9c4b53a8b" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.402903 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.793374 5127 generic.go:334] "Generic (PLEG): container finished" podID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerID="071759326f699e0c8d2cc65ebcb098be6183ebea05028fcaa8cfb94bb6f1402b" exitCode=0
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.793941 5127 generic.go:334] "Generic (PLEG): container finished" podID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerID="efcbb319cd668430dd4c8ecca3d58d47f9cf2aaaaa4b1ac43810a468a87b7cbd" exitCode=2
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.794035 5127 generic.go:334] "Generic (PLEG): container finished" podID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerID="3a2a2c655dd29b2aebae70b7233ae2df0528dd625d03162ba4c1c898097310b8" exitCode=0
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.793462 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerDied","Data":"071759326f699e0c8d2cc65ebcb098be6183ebea05028fcaa8cfb94bb6f1402b"}
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.794328 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerDied","Data":"efcbb319cd668430dd4c8ecca3d58d47f9cf2aaaaa4b1ac43810a468a87b7cbd"}
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.794426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerDied","Data":"3a2a2c655dd29b2aebae70b7233ae2df0528dd625d03162ba4c1c898097310b8"}
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.797474 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-log" containerID="cri-o://7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.798106 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" event={"ID":"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a","Type":"ContainerStarted","Data":"f162bb848cfee5be37c6d67f9f232905d8e7c65a774425c0a49d943f58e74593"}
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.798544 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-api" containerID="cri-o://74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff" gracePeriod=30
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.798716 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m"
Feb 01 07:11:49 crc kubenswrapper[5127]: I0201 07:11:49.829565 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" podStartSLOduration=3.8295470480000002 podStartE2EDuration="3.829547048s" podCreationTimestamp="2026-02-01 07:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:49.820628913 +0000 UTC m=+1460.306531296" watchObservedRunningTime="2026-02-01 07:11:49.829547048 +0000 UTC m=+1460.315449421"
Feb 01 07:11:50 crc kubenswrapper[5127]: I0201 07:11:50.173281 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 07:11:50 crc kubenswrapper[5127]: I0201 07:11:50.173342 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 07:11:50 crc kubenswrapper[5127]: I0201 07:11:50.186809 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 07:11:50 crc kubenswrapper[5127]: I0201 07:11:50.810019 5127 generic.go:334] "Generic (PLEG): container finished" podID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerID="7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf" exitCode=143
Feb 01 07:11:50 crc kubenswrapper[5127]: I0201 07:11:50.810429 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerDied","Data":"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"}
Feb 01 07:11:52 crc kubenswrapper[5127]: I0201 07:11:52.843941 5127 generic.go:334] "Generic (PLEG): container finished" podID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerID="3e5013ec87f4223b60ab710d38021572bf1e511c76300d29da5408c9c4b53a8b" exitCode=0
Feb 01 07:11:52 crc kubenswrapper[5127]: I0201 07:11:52.844532 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerDied","Data":"3e5013ec87f4223b60ab710d38021572bf1e511c76300d29da5408c9c4b53a8b"}
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.073539 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.259554 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260004 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260114 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260169 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wn4z\" (UniqueName: \"kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260239 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260286 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260341 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.260361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data\") pod \"70bc113f-723b-4328-9e57-6be9ae93b5cb\" (UID: \"70bc113f-723b-4328-9e57-6be9ae93b5cb\") "
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.261137 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.261167 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.267058 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts" (OuterVolumeSpecName: "scripts") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.267095 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z" (OuterVolumeSpecName: "kube-api-access-8wn4z") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "kube-api-access-8wn4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.267241 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.340830 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.351927 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.361703 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv84d\" (UniqueName: \"kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d\") pod \"56f7aae5-0170-4c68-8bac-8272ce4cef12\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.361865 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs\") pod \"56f7aae5-0170-4c68-8bac-8272ce4cef12\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362016 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data\") pod \"56f7aae5-0170-4c68-8bac-8272ce4cef12\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362191 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle\") pod \"56f7aae5-0170-4c68-8bac-8272ce4cef12\" (UID: \"56f7aae5-0170-4c68-8bac-8272ce4cef12\") " Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362396 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs" (OuterVolumeSpecName: "logs") pod "56f7aae5-0170-4c68-8bac-8272ce4cef12" (UID: "56f7aae5-0170-4c68-8bac-8272ce4cef12"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362877 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362905 5127 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362918 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362931 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56f7aae5-0170-4c68-8bac-8272ce4cef12-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362944 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362955 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/70bc113f-723b-4328-9e57-6be9ae93b5cb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.362968 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wn4z\" (UniqueName: \"kubernetes.io/projected/70bc113f-723b-4328-9e57-6be9ae93b5cb-kube-api-access-8wn4z\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.365218 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d" (OuterVolumeSpecName: "kube-api-access-tv84d") pod "56f7aae5-0170-4c68-8bac-8272ce4cef12" (UID: "56f7aae5-0170-4c68-8bac-8272ce4cef12"). InnerVolumeSpecName "kube-api-access-tv84d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.396645 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data" (OuterVolumeSpecName: "config-data") pod "56f7aae5-0170-4c68-8bac-8272ce4cef12" (UID: "56f7aae5-0170-4c68-8bac-8272ce4cef12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.403535 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.415136 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56f7aae5-0170-4c68-8bac-8272ce4cef12" (UID: "56f7aae5-0170-4c68-8bac-8272ce4cef12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.424837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data" (OuterVolumeSpecName: "config-data") pod "70bc113f-723b-4328-9e57-6be9ae93b5cb" (UID: "70bc113f-723b-4328-9e57-6be9ae93b5cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.463766 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.463802 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv84d\" (UniqueName: \"kubernetes.io/projected/56f7aae5-0170-4c68-8bac-8272ce4cef12-kube-api-access-tv84d\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.463817 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.463829 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70bc113f-723b-4328-9e57-6be9ae93b5cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.463843 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56f7aae5-0170-4c68-8bac-8272ce4cef12-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.854243 5127 generic.go:334] "Generic (PLEG): container finished" podID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerID="74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff" exitCode=0 Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.854325 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerDied","Data":"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff"} Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.854356 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56f7aae5-0170-4c68-8bac-8272ce4cef12","Type":"ContainerDied","Data":"4aa48ff0d3bfac9d9a42b4b70a134fec75ca08c77b6c35febb9a5fd8f7b2aae2"} Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.854373 5127 scope.go:117] "RemoveContainer" containerID="74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff" Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.854535 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.858794 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"70bc113f-723b-4328-9e57-6be9ae93b5cb","Type":"ContainerDied","Data":"d884b3cd2fb8974fd02b7b89b10d060c906b1dbbb35b309116f7414ff1f821a4"}
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.858941 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.903009 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.914233 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.917910 5127 scope.go:117] "RemoveContainer" containerID="7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.928731 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.941983 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942537 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="sg-core"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942563 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="sg-core"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942595 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-api"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942603 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-api"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942625 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-notification-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942634 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-notification-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942653 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-log"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942661 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-log"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942672 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-central-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942679 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-central-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.942699 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="proxy-httpd"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942707 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="proxy-httpd"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942915 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-notification-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942939 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-api"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942959 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="proxy-httpd"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942968 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="sg-core"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942982 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" containerName="ceilometer-central-agent"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.942992 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" containerName="nova-api-log"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.944002 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.944148 5127 scope.go:117] "RemoveContainer" containerID="74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.948888 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.949078 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff\": container with ID starting with 74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff not found: ID does not exist" containerID="74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.949129 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff"} err="failed to get container status \"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff\": rpc error: code = NotFound desc = could not find container \"74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff\": container with ID starting with 74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff not found: ID does not exist"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.949163 5127 scope.go:117] "RemoveContainer" containerID="7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.949221 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 01 07:11:53 crc kubenswrapper[5127]: E0201 07:11:53.950749 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf\": container with ID starting with 7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf not found: ID does not exist" containerID="7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"
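[Editor's note] The two "ContainerStatus from runtime service failed ... NotFound" errors here look like the usual remove/inspect race: by the time the kubelet retries RemoveContainer, CRI-O has already deleted the container, so the error is logged and ignored. A sketch of that tolerant-deletion pattern using gRPC status codes; only the google.golang.org/grpc status/codes API is real, the removeContainer helper is a stand-in and not the kubelet's actual function:

    // notfound.go - treat a NotFound from the runtime as "already deleted".
    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer mimics a CRI RemoveContainer call that loses the race
    // and finds the container already gone, as in the log entries above.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    func main() {
        err := removeContainer("74c2d70f4ee1b80b2005595f7abf96be4f5d1057e05c6521f529120f516ff9ff")
        if status.Code(err) == codes.NotFound {
            // Already deleted by the runtime: log and treat as success.
            fmt.Println("ignoring:", err)
            return
        }
        if err != nil {
            panic(err) // any other error is a real failure
        }
    }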
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.950788 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf"} err="failed to get container status \"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf\": rpc error: code = NotFound desc = could not find container \"7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf\": container with ID starting with 7ca8901ec1cdf4c747b5fa7d4e2e2300d6690c1106788db46f50d3da91e8a1bf not found: ID does not exist"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.950808 5127 scope.go:117] "RemoveContainer" containerID="071759326f699e0c8d2cc65ebcb098be6183ebea05028fcaa8cfb94bb6f1402b"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.962940 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975152 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975193 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975221 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv6px\" (UniqueName: \"kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975249 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975271 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.975338 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.979391 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:11:53 crc kubenswrapper[5127]: I0201 07:11:53.987362 5127 scope.go:117] "RemoveContainer" containerID="efcbb319cd668430dd4c8ecca3d58d47f9cf2aaaaa4b1ac43810a468a87b7cbd"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.000413 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.009217 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.011180 5127 scope.go:117] "RemoveContainer" containerID="3e5013ec87f4223b60ab710d38021572bf1e511c76300d29da5408c9c4b53a8b"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.011468 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.016270 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.016273 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.016352 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.021928 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.039011 5127 scope.go:117] "RemoveContainer" containerID="3a2a2c655dd29b2aebae70b7233ae2df0528dd625d03162ba4c1c898097310b8"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077684 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077711 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077769 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zx4g\" (UniqueName: \"kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077872 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077944 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6px\" (UniqueName: \"kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.077977 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078031 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078083 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078102 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078117 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078785 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078895 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.078997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0"
Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.081293 5127
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.082553 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.085125 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.087141 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.098460 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6px\" (UniqueName: \"kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px\") pod \"nova-api-0\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181039 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181094 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181143 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zx4g\" (UniqueName: \"kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181209 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181282 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181318 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181358 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.181385 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.182315 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.182716 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.185237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.185331 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.186047 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.187514 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.207362 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.210602 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6zx4g\" (UniqueName: \"kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g\") pod \"ceilometer-0\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.252439 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f7aae5-0170-4c68-8bac-8272ce4cef12" path="/var/lib/kubelet/pods/56f7aae5-0170-4c68-8bac-8272ce4cef12/volumes" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.253176 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70bc113f-723b-4328-9e57-6be9ae93b5cb" path="/var/lib/kubelet/pods/70bc113f-723b-4328-9e57-6be9ae93b5cb/volumes" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.277182 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.326176 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.730409 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:11:54 crc kubenswrapper[5127]: W0201 07:11:54.748465 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod779a99b7_5e9d_4399_aef7_bdc09971b060.slice/crio-5c4b442a0ce6abf80b07b65392b79e92675ae61d96305406c7fb515c7805fdf4 WatchSource:0}: Error finding container 5c4b442a0ce6abf80b07b65392b79e92675ae61d96305406c7fb515c7805fdf4: Status 404 returned error can't find the container with id 5c4b442a0ce6abf80b07b65392b79e92675ae61d96305406c7fb515c7805fdf4 Feb 01 07:11:54 crc kubenswrapper[5127]: W0201 07:11:54.816066 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd79bb5e5_a6e6_46ee_b04c_bd5249adb8bd.slice/crio-796502bc0eec651a9fbd813406160e910689ed1dfcf89837fb447ba9e8a1eb13 WatchSource:0}: Error finding container 796502bc0eec651a9fbd813406160e910689ed1dfcf89837fb447ba9e8a1eb13: Status 404 returned error can't find the container with id 796502bc0eec651a9fbd813406160e910689ed1dfcf89837fb447ba9e8a1eb13 Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.816224 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.819228 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.878168 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerStarted","Data":"5c4b442a0ce6abf80b07b65392b79e92675ae61d96305406c7fb515c7805fdf4"} Feb 01 07:11:54 crc kubenswrapper[5127]: I0201 07:11:54.879426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerStarted","Data":"796502bc0eec651a9fbd813406160e910689ed1dfcf89837fb447ba9e8a1eb13"} Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.173615 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.173942 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 07:11:55 
crc kubenswrapper[5127]: I0201 07:11:55.186892 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.213418 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.894068 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerStarted","Data":"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907"} Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.894358 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerStarted","Data":"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91"} Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.901653 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerStarted","Data":"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b"} Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.925307 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:11:55 crc kubenswrapper[5127]: I0201 07:11:55.959094 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9590710319999998 podStartE2EDuration="2.959071032s" podCreationTimestamp="2026-02-01 07:11:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:55.925693162 +0000 UTC m=+1466.411595525" watchObservedRunningTime="2026-02-01 07:11:55.959071032 +0000 UTC m=+1466.444973395" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.116309 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vgtk6"] Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.117806 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.122246 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.122522 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.123265 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.123412 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5stch\" (UniqueName: \"kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.123555 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.123716 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.145740 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vgtk6"] Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.187525 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.187558 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.225973 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5stch\" (UniqueName: \"kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.226084 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.226117 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.226168 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.231384 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.232255 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.233280 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.247725 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5stch\" (UniqueName: \"kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch\") pod \"nova-cell1-cell-mapping-vgtk6\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.445957 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.911099 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerStarted","Data":"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357"} Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.911748 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerStarted","Data":"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e"} Feb 01 07:11:56 crc kubenswrapper[5127]: I0201 07:11:56.958661 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vgtk6"] Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.326722 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.423643 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.423899 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="dnsmasq-dns" containerID="cri-o://7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c" gracePeriod=10 Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.919448 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.919847 5127 generic.go:334] "Generic (PLEG): container finished" podID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerID="7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c" exitCode=0 Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.919922 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" event={"ID":"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f","Type":"ContainerDied","Data":"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c"} Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.919946 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" event={"ID":"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f","Type":"ContainerDied","Data":"8e00ee46b69eebbe7901c937874c0135e0b9d350989c6d1a3f7ca4cb25757286"} Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.919961 5127 scope.go:117] "RemoveContainer" containerID="7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.921536 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vgtk6" event={"ID":"21b313db-9404-4f6c-8998-800ea3110fc9","Type":"ContainerStarted","Data":"8cf8bfca617699b4935b3ba3f769e3af08c1e668d2cbc37132518f3e8740bc5f"} Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.921561 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vgtk6" event={"ID":"21b313db-9404-4f6c-8998-800ea3110fc9","Type":"ContainerStarted","Data":"9562afd63d006a597505a8762877f6c14f2c2b7204078ee496d665d5cf490b4c"} Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.939975 5127 scope.go:117] "RemoveContainer" 
containerID="cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.960740 5127 scope.go:117] "RemoveContainer" containerID="7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c" Feb 01 07:11:57 crc kubenswrapper[5127]: E0201 07:11:57.961174 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c\": container with ID starting with 7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c not found: ID does not exist" containerID="7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.961207 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c"} err="failed to get container status \"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c\": rpc error: code = NotFound desc = could not find container \"7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c\": container with ID starting with 7d621dc70f64813cde209b5eeb54f20dc899405f6d17d20dd0eefea9c631ee4c not found: ID does not exist" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.961231 5127 scope.go:117] "RemoveContainer" containerID="cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17" Feb 01 07:11:57 crc kubenswrapper[5127]: E0201 07:11:57.961502 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17\": container with ID starting with cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17 not found: ID does not exist" containerID="cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.961524 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17"} err="failed to get container status \"cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17\": rpc error: code = NotFound desc = could not find container \"cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17\": container with ID starting with cff01c490562a8acdce94ddca1680e1a12fd94f24bce58e13ff5dea6c3308c17 not found: ID does not exist" Feb 01 07:11:57 crc kubenswrapper[5127]: I0201 07:11:57.973978 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vgtk6" podStartSLOduration=1.973962345 podStartE2EDuration="1.973962345s" podCreationTimestamp="2026-02-01 07:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:11:57.962727619 +0000 UTC m=+1468.448629982" watchObservedRunningTime="2026-02-01 07:11:57.973962345 +0000 UTC m=+1468.459864708" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057140 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjwjw\" (UniqueName: \"kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057226 
5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057244 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057261 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057331 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.057352 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0\") pod \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\" (UID: \"f272e6a0-6e5c-4b7b-9118-a3ee4adae73f\") " Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.082785 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw" (OuterVolumeSpecName: "kube-api-access-bjwjw") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "kube-api-access-bjwjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.118011 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config" (OuterVolumeSpecName: "config") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.125673 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.134651 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.138043 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.142363 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" (UID: "f272e6a0-6e5c-4b7b-9118-a3ee4adae73f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158783 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158815 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158823 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158835 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158843 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.158854 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjwjw\" (UniqueName: \"kubernetes.io/projected/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f-kube-api-access-bjwjw\") on node \"crc\" DevicePath \"\"" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.939283 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-gxls2" Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.971682 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:58 crc kubenswrapper[5127]: I0201 07:11:58.985454 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-gxls2"] Feb 01 07:11:59 crc kubenswrapper[5127]: I0201 07:11:59.955903 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerStarted","Data":"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49"} Feb 01 07:11:59 crc kubenswrapper[5127]: I0201 07:11:59.957219 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 07:11:59 crc kubenswrapper[5127]: I0201 07:11:59.997289 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.987779417 podStartE2EDuration="6.997268459s" podCreationTimestamp="2026-02-01 07:11:53 +0000 UTC" firstStartedPulling="2026-02-01 07:11:54.818911288 +0000 UTC m=+1465.304813661" lastFinishedPulling="2026-02-01 07:11:58.82840035 +0000 UTC m=+1469.314302703" observedRunningTime="2026-02-01 07:11:59.985817308 +0000 UTC m=+1470.471719671" watchObservedRunningTime="2026-02-01 07:11:59.997268459 +0000 UTC m=+1470.483170822" Feb 01 07:12:00 crc kubenswrapper[5127]: I0201 07:12:00.246149 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" path="/var/lib/kubelet/pods/f272e6a0-6e5c-4b7b-9118-a3ee4adae73f/volumes" Feb 01 07:12:01 crc kubenswrapper[5127]: I0201 07:12:01.985174 5127 generic.go:334] "Generic (PLEG): container finished" podID="21b313db-9404-4f6c-8998-800ea3110fc9" containerID="8cf8bfca617699b4935b3ba3f769e3af08c1e668d2cbc37132518f3e8740bc5f" exitCode=0 Feb 01 07:12:01 crc kubenswrapper[5127]: I0201 07:12:01.985409 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vgtk6" event={"ID":"21b313db-9404-4f6c-8998-800ea3110fc9","Type":"ContainerDied","Data":"8cf8bfca617699b4935b3ba3f769e3af08c1e668d2cbc37132518f3e8740bc5f"} Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.338311 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.501407 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data\") pod \"21b313db-9404-4f6c-8998-800ea3110fc9\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.501506 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts\") pod \"21b313db-9404-4f6c-8998-800ea3110fc9\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.501565 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5stch\" (UniqueName: \"kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch\") pod \"21b313db-9404-4f6c-8998-800ea3110fc9\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.501610 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle\") pod \"21b313db-9404-4f6c-8998-800ea3110fc9\" (UID: \"21b313db-9404-4f6c-8998-800ea3110fc9\") " Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.507823 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch" (OuterVolumeSpecName: "kube-api-access-5stch") pod "21b313db-9404-4f6c-8998-800ea3110fc9" (UID: "21b313db-9404-4f6c-8998-800ea3110fc9"). InnerVolumeSpecName "kube-api-access-5stch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.508618 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts" (OuterVolumeSpecName: "scripts") pod "21b313db-9404-4f6c-8998-800ea3110fc9" (UID: "21b313db-9404-4f6c-8998-800ea3110fc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.547861 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data" (OuterVolumeSpecName: "config-data") pod "21b313db-9404-4f6c-8998-800ea3110fc9" (UID: "21b313db-9404-4f6c-8998-800ea3110fc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.552510 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21b313db-9404-4f6c-8998-800ea3110fc9" (UID: "21b313db-9404-4f6c-8998-800ea3110fc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.603761 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.603806 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.603817 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5stch\" (UniqueName: \"kubernetes.io/projected/21b313db-9404-4f6c-8998-800ea3110fc9-kube-api-access-5stch\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:03 crc kubenswrapper[5127]: I0201 07:12:03.603828 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21b313db-9404-4f6c-8998-800ea3110fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.004087 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vgtk6" event={"ID":"21b313db-9404-4f6c-8998-800ea3110fc9","Type":"ContainerDied","Data":"9562afd63d006a597505a8762877f6c14f2c2b7204078ee496d665d5cf490b4c"} Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.004122 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9562afd63d006a597505a8762877f6c14f2c2b7204078ee496d665d5cf490b4c" Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.004125 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vgtk6" Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.194883 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.195335 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-api" containerID="cri-o://0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" gracePeriod=30 Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.195218 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-log" containerID="cri-o://f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" gracePeriod=30 Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.219412 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.219687 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerName="nova-scheduler-scheduler" containerID="cri-o://38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" gracePeriod=30 Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.250810 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.251457 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07697040-f619-40c4-af39-0959e02a1db8" 
containerName="nova-metadata-metadata" containerID="cri-o://5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045" gracePeriod=30 Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.251672 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-log" containerID="cri-o://ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d" gracePeriod=30 Feb 01 07:12:04 crc kubenswrapper[5127]: I0201 07:12:04.861714 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.012054 5127 generic.go:334] "Generic (PLEG): container finished" podID="07697040-f619-40c4-af39-0959e02a1db8" containerID="ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d" exitCode=143 Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.012104 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerDied","Data":"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"} Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014388 5127 generic.go:334] "Generic (PLEG): container finished" podID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerID="0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" exitCode=0 Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014422 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014441 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerDied","Data":"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907"} Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerDied","Data":"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91"} Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014493 5127 scope.go:117] "RemoveContainer" containerID="0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014424 5127 generic.go:334] "Generic (PLEG): container finished" podID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerID="f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" exitCode=143 Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.014627 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"779a99b7-5e9d-4399-aef7-bdc09971b060","Type":"ContainerDied","Data":"5c4b442a0ce6abf80b07b65392b79e92675ae61d96305406c7fb515c7805fdf4"} Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031075 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031140 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv6px\" (UniqueName: 
\"kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031189 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031211 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031228 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.031272 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle\") pod \"779a99b7-5e9d-4399-aef7-bdc09971b060\" (UID: \"779a99b7-5e9d-4399-aef7-bdc09971b060\") " Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.032958 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs" (OuterVolumeSpecName: "logs") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.036464 5127 scope.go:117] "RemoveContainer" containerID="f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.038087 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px" (OuterVolumeSpecName: "kube-api-access-xv6px") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "kube-api-access-xv6px". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.068776 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.070297 5127 scope.go:117] "RemoveContainer" containerID="0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.070660 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907\": container with ID starting with 0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907 not found: ID does not exist" containerID="0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.070688 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907"} err="failed to get container status \"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907\": rpc error: code = NotFound desc = could not find container \"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907\": container with ID starting with 0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907 not found: ID does not exist" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.070708 5127 scope.go:117] "RemoveContainer" containerID="f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.071109 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91\": container with ID starting with f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91 not found: ID does not exist" containerID="f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.071134 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91"} err="failed to get container status \"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91\": rpc error: code = NotFound desc = could not find container \"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91\": container with ID starting with f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91 not found: ID does not exist" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.071150 5127 scope.go:117] "RemoveContainer" containerID="0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.071437 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907"} err="failed to get container status \"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907\": rpc error: code = NotFound desc = could not find container \"0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907\": container with ID starting with 0e026838018b111040151d423a28593dd3ed969eb20770b4d7b314b1311b5907 not found: ID does not exist" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.071511 5127 scope.go:117] "RemoveContainer" containerID="f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.071810 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91"} err="failed to get container status \"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91\": rpc error: code = NotFound desc = could not find container \"f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91\": container with ID starting with f0083991125c81497bf6a04c8206e5b9c69a1a9142d3427eb003effd5eee3d91 not found: ID does not exist" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.076290 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data" (OuterVolumeSpecName: "config-data") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.090763 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.100284 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "779a99b7-5e9d-4399-aef7-bdc09971b060" (UID: "779a99b7-5e9d-4399-aef7-bdc09971b060"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135443 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135474 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv6px\" (UniqueName: \"kubernetes.io/projected/779a99b7-5e9d-4399-aef7-bdc09971b060-kube-api-access-xv6px\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135497 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135507 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/779a99b7-5e9d-4399-aef7-bdc09971b060-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135516 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.135524 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/779a99b7-5e9d-4399-aef7-bdc09971b060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.379692 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.389315 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.402824 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.403276 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-api" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403302 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-api" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.403318 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="dnsmasq-dns" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403326 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="dnsmasq-dns" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.403340 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-log" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403348 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-log" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.403371 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b313db-9404-4f6c-8998-800ea3110fc9" containerName="nova-manage" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403379 5127 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="21b313db-9404-4f6c-8998-800ea3110fc9" containerName="nova-manage" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.403394 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="init" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403401 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="init" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403652 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f272e6a0-6e5c-4b7b-9118-a3ee4adae73f" containerName="dnsmasq-dns" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403681 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-api" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403707 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b313db-9404-4f6c-8998-800ea3110fc9" containerName="nova-manage" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.403725 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" containerName="nova-api-log" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.405849 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.407774 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.408259 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.411747 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.417036 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542552 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542679 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nf8m\" (UniqueName: \"kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542728 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " 
pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542821 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.542844 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644225 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644281 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644362 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644406 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644445 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nf8m\" (UniqueName: \"kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.644498 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.645155 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.648622 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " 
pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.649767 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.650565 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.664562 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nf8m\" (UniqueName: \"kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.665149 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data\") pod \"nova-api-0\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: I0201 07:12:05.720375 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.918497 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.920627 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.922467 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 07:12:05 crc kubenswrapper[5127]: E0201 07:12:05.922502 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerName="nova-scheduler-scheduler" Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.219614 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:06 crc kubenswrapper[5127]: W0201 07:12:06.221029 5127 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85085ef_a23e_41f4_8839_08915aaaef7e.slice/crio-b52b7fce573e5b869172d9ae9f26a0cacbe7fa672a45539ebf3f37972819a8d7 WatchSource:0}: Error finding container b52b7fce573e5b869172d9ae9f26a0cacbe7fa672a45539ebf3f37972819a8d7: Status 404 returned error can't find the container with id b52b7fce573e5b869172d9ae9f26a0cacbe7fa672a45539ebf3f37972819a8d7 Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.260312 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="779a99b7-5e9d-4399-aef7-bdc09971b060" path="/var/lib/kubelet/pods/779a99b7-5e9d-4399-aef7-bdc09971b060/volumes" Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.740672 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.741065 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.741127 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.742051 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:12:06 crc kubenswrapper[5127]: I0201 07:12:06.742121 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6" gracePeriod=600 Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.054649 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6" exitCode=0 Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.054787 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6"} Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.055095 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e"} Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.055119 5127 scope.go:117] "RemoveContainer" containerID="0702e12609ce38f8f96c08a0dc24be3679aca29131a880c9fa0e9bf1dfbadcf5" 
Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.058228 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerStarted","Data":"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe"}
Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.058262 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerStarted","Data":"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a"}
Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.058272 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerStarted","Data":"b52b7fce573e5b869172d9ae9f26a0cacbe7fa672a45539ebf3f37972819a8d7"}
Feb 01 07:12:07 crc kubenswrapper[5127]: I0201 07:12:07.092921 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.09289717 podStartE2EDuration="2.09289717s" podCreationTimestamp="2026-02-01 07:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:12:07.08343506 +0000 UTC m=+1477.569337423" watchObservedRunningTime="2026-02-01 07:12:07.09289717 +0000 UTC m=+1477.578799533"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.891980 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.987640 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle\") pod \"07697040-f619-40c4-af39-0959e02a1db8\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") "
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.987705 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs\") pod \"07697040-f619-40c4-af39-0959e02a1db8\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") "
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.987775 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data\") pod \"07697040-f619-40c4-af39-0959e02a1db8\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") "
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.987865 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs\") pod \"07697040-f619-40c4-af39-0959e02a1db8\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") "
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.987929 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89qqq\" (UniqueName: \"kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq\") pod \"07697040-f619-40c4-af39-0959e02a1db8\" (UID: \"07697040-f619-40c4-af39-0959e02a1db8\") "
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.990486 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs" (OuterVolumeSpecName: "logs") pod "07697040-f619-40c4-af39-0959e02a1db8" (UID: "07697040-f619-40c4-af39-0959e02a1db8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:07.997692 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq" (OuterVolumeSpecName: "kube-api-access-89qqq") pod "07697040-f619-40c4-af39-0959e02a1db8" (UID: "07697040-f619-40c4-af39-0959e02a1db8"). InnerVolumeSpecName "kube-api-access-89qqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.021425 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07697040-f619-40c4-af39-0959e02a1db8" (UID: "07697040-f619-40c4-af39-0959e02a1db8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.046406 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data" (OuterVolumeSpecName: "config-data") pod "07697040-f619-40c4-af39-0959e02a1db8" (UID: "07697040-f619-40c4-af39-0959e02a1db8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.049741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "07697040-f619-40c4-af39-0959e02a1db8" (UID: "07697040-f619-40c4-af39-0959e02a1db8"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.069715 5127 generic.go:334] "Generic (PLEG): container finished" podID="07697040-f619-40c4-af39-0959e02a1db8" containerID="5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045" exitCode=0
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.069862 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.069931 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerDied","Data":"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"}
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.069978 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"07697040-f619-40c4-af39-0959e02a1db8","Type":"ContainerDied","Data":"55e02135dd6d48cfacc2b665ad5b536b2ee48bdd18d3d3a2254845cf426732ca"}
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.070005 5127 scope.go:117] "RemoveContainer" containerID="5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.091866 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.091904 5127 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.091922 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07697040-f619-40c4-af39-0959e02a1db8-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.091936 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07697040-f619-40c4-af39-0959e02a1db8-logs\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.091951 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89qqq\" (UniqueName: \"kubernetes.io/projected/07697040-f619-40c4-af39-0959e02a1db8-kube-api-access-89qqq\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.113819 5127 scope.go:117] "RemoveContainer" containerID="ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.129717 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.141618 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.142733 5127 scope.go:117] "RemoveContainer" containerID="5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"
Feb 01 07:12:08 crc kubenswrapper[5127]: E0201 07:12:08.143143 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045\": container with ID starting with 5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045 not found: ID does not exist" containerID="5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.143180 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045"} err="failed to get container status \"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045\": rpc error: code = NotFound desc = could not find container \"5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045\": container with ID starting with 5a17164957cb8c104d0cb7611afdf793003b5106ad8df22bbd52d2a8221a5045 not found: ID does not exist"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.143204 5127 scope.go:117] "RemoveContainer" containerID="ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"
Feb 01 07:12:08 crc kubenswrapper[5127]: E0201 07:12:08.143452 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d\": container with ID starting with ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d not found: ID does not exist" containerID="ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.143477 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d"} err="failed to get container status \"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d\": rpc error: code = NotFound desc = could not find container \"ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d\": container with ID starting with ebff1109d836e7fc4e1cc8fc7d02b273017bb0c3c3150a58ecae9b15aeb2c10d not found: ID does not exist"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.152286 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:08 crc kubenswrapper[5127]: E0201 07:12:08.152864 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-metadata"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.152881 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-metadata"
Feb 01 07:12:08 crc kubenswrapper[5127]: E0201 07:12:08.152890 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-log"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.152900 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-log"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.153076 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-metadata"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.153106 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="07697040-f619-40c4-af39-0959e02a1db8" containerName="nova-metadata-log"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.154040 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.156263 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.156696 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.161131 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.260651 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07697040-f619-40c4-af39-0959e02a1db8" path="/var/lib/kubelet/pods/07697040-f619-40c4-af39-0959e02a1db8/volumes"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.295930 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.295999 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.296108 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hkfb\" (UniqueName: \"kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.296280 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.296533 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.398096 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hkfb\" (UniqueName: \"kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.398525 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.398607 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.398711 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.398744 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.399726 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.402407 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.403181 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.403249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.419604 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hkfb\" (UniqueName: \"kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb\") pod \"nova-metadata-0\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.478604 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 07:12:08 crc kubenswrapper[5127]: I0201 07:12:08.944865 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:08 crc kubenswrapper[5127]: W0201 07:12:08.955117 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaed0e157_f34a_4343_ae3b_71e045eb4cf4.slice/crio-053e414d70c59a0901ee3b9b4b7930d0344dd5350a87e82274d651eb5343cab4 WatchSource:0}: Error finding container 053e414d70c59a0901ee3b9b4b7930d0344dd5350a87e82274d651eb5343cab4: Status 404 returned error can't find the container with id 053e414d70c59a0901ee3b9b4b7930d0344dd5350a87e82274d651eb5343cab4
Feb 01 07:12:09 crc kubenswrapper[5127]: I0201 07:12:09.087118 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerStarted","Data":"053e414d70c59a0901ee3b9b4b7930d0344dd5350a87e82274d651eb5343cab4"}
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.107820 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerStarted","Data":"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277"}
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.110334 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerStarted","Data":"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75"}
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.112684 5127 generic.go:334] "Generic (PLEG): container finished" podID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerID="38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" exitCode=0
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.112843 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f9738e6-08e9-42a5-b2c9-1b9f9d944239","Type":"ContainerDied","Data":"38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c"}
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.112955 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f9738e6-08e9-42a5-b2c9-1b9f9d944239","Type":"ContainerDied","Data":"25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c"}
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.113098 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dc2d27260112cda63cd9aff283cd2b5346fec66b4c2549e53b1b80a266369c"
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.133380 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.133360196 podStartE2EDuration="2.133360196s" podCreationTimestamp="2026-02-01 07:12:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:12:10.132674229 +0000 UTC m=+1480.618576602" watchObservedRunningTime="2026-02-01 07:12:10.133360196 +0000 UTC m=+1480.619262569"
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.158158 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.171415 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlhcr\" (UniqueName: \"kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr\") pod \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") "
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.171750 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle\") pod \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") "
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.171916 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data\") pod \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\" (UID: \"2f9738e6-08e9-42a5-b2c9-1b9f9d944239\") "
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.182396 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr" (OuterVolumeSpecName: "kube-api-access-jlhcr") pod "2f9738e6-08e9-42a5-b2c9-1b9f9d944239" (UID: "2f9738e6-08e9-42a5-b2c9-1b9f9d944239"). InnerVolumeSpecName "kube-api-access-jlhcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.207950 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f9738e6-08e9-42a5-b2c9-1b9f9d944239" (UID: "2f9738e6-08e9-42a5-b2c9-1b9f9d944239"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.246408 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data" (OuterVolumeSpecName: "config-data") pod "2f9738e6-08e9-42a5-b2c9-1b9f9d944239" (UID: "2f9738e6-08e9-42a5-b2c9-1b9f9d944239"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.273801 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlhcr\" (UniqueName: \"kubernetes.io/projected/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-kube-api-access-jlhcr\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.273852 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:10 crc kubenswrapper[5127]: I0201 07:12:10.273874 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f9738e6-08e9-42a5-b2c9-1b9f9d944239-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.136465 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.174084 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.194666 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.203186 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:11 crc kubenswrapper[5127]: E0201 07:12:11.203766 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerName="nova-scheduler-scheduler"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.203790 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerName="nova-scheduler-scheduler"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.204050 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" containerName="nova-scheduler-scheduler"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.204860 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.214457 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.216234 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.391342 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcff\" (UniqueName: \"kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.392108 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.392187 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.494666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcff\" (UniqueName: \"kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.494799 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.494836 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.505107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.506662 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.519705 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcff\" (UniqueName: \"kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff\") pod \"nova-scheduler-0\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " pod="openstack/nova-scheduler-0"
Feb 01 07:12:11 crc kubenswrapper[5127]: I0201 07:12:11.540861 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 01 07:12:12 crc kubenswrapper[5127]: I0201 07:12:12.044133 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:12 crc kubenswrapper[5127]: W0201 07:12:12.049990 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod644a363d_bd2b_4cb5_81bf_05f7514d7abe.slice/crio-930039a4d4d51370415637b7777b5091e70fe46a10eda86b6e553d1455075f78 WatchSource:0}: Error finding container 930039a4d4d51370415637b7777b5091e70fe46a10eda86b6e553d1455075f78: Status 404 returned error can't find the container with id 930039a4d4d51370415637b7777b5091e70fe46a10eda86b6e553d1455075f78
Feb 01 07:12:12 crc kubenswrapper[5127]: I0201 07:12:12.147959 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"644a363d-bd2b-4cb5-81bf-05f7514d7abe","Type":"ContainerStarted","Data":"930039a4d4d51370415637b7777b5091e70fe46a10eda86b6e553d1455075f78"}
Feb 01 07:12:12 crc kubenswrapper[5127]: I0201 07:12:12.248084 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f9738e6-08e9-42a5-b2c9-1b9f9d944239" path="/var/lib/kubelet/pods/2f9738e6-08e9-42a5-b2c9-1b9f9d944239/volumes"
Feb 01 07:12:13 crc kubenswrapper[5127]: I0201 07:12:13.164885 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"644a363d-bd2b-4cb5-81bf-05f7514d7abe","Type":"ContainerStarted","Data":"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137"}
Feb 01 07:12:13 crc kubenswrapper[5127]: I0201 07:12:13.201928 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.201900192 podStartE2EDuration="2.201900192s" podCreationTimestamp="2026-02-01 07:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:12:13.193413419 +0000 UTC m=+1483.679315832" watchObservedRunningTime="2026-02-01 07:12:13.201900192 +0000 UTC m=+1483.687802555"
Feb 01 07:12:13 crc kubenswrapper[5127]: I0201 07:12:13.479305 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 07:12:13 crc kubenswrapper[5127]: I0201 07:12:13.479380 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 07:12:15 crc kubenswrapper[5127]: I0201 07:12:15.721825 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 01 07:12:15 crc kubenswrapper[5127]: I0201 07:12:15.723794 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 01 07:12:16 crc kubenswrapper[5127]: I0201 07:12:16.541982 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 01 07:12:16 crc kubenswrapper[5127]: I0201 07:12:16.737850 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 01 07:12:16 crc kubenswrapper[5127]: I0201 07:12:16.737921 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 01 07:12:18 crc kubenswrapper[5127]: I0201 07:12:18.488696 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 01 07:12:18 crc kubenswrapper[5127]: I0201 07:12:18.489120 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 01 07:12:19 crc kubenswrapper[5127]: I0201 07:12:19.500808 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 01 07:12:19 crc kubenswrapper[5127]: I0201 07:12:19.500796 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.542047 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.588795 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.808040 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"]
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.810181 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.823571 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"]
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.915211 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vp7j\" (UniqueName: \"kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.915280 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:21 crc kubenswrapper[5127]: I0201 07:12:21.915414 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.016912 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vp7j\" (UniqueName: \"kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.016984 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.017055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.017466 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.017637 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.042918 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vp7j\" (UniqueName: \"kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j\") pod \"redhat-operators-hkhkn\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.153961 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.344723 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 01 07:12:22 crc kubenswrapper[5127]: I0201 07:12:22.677780 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"]
Feb 01 07:12:23 crc kubenswrapper[5127]: I0201 07:12:23.274004 5127 generic.go:334] "Generic (PLEG): container finished" podID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerID="a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675" exitCode=0
Feb 01 07:12:23 crc kubenswrapper[5127]: I0201 07:12:23.274129 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerDied","Data":"a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675"}
Feb 01 07:12:23 crc kubenswrapper[5127]: I0201 07:12:23.274291 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerStarted","Data":"415dab1394fcb324b8d0312c50bcfa58446a45c69e6535f81f32b985e88814d4"}
Feb 01 07:12:24 crc kubenswrapper[5127]: I0201 07:12:24.287258 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerStarted","Data":"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0"}
Feb 01 07:12:24 crc kubenswrapper[5127]: I0201 07:12:24.335564 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.733970 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.734319 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.734749 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.734766 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.757981 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 07:12:25 crc kubenswrapper[5127]: I0201 07:12:25.759204 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 07:12:26 crc kubenswrapper[5127]: I0201 07:12:26.311318 5127 generic.go:334] "Generic (PLEG): container finished" podID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerID="38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0" exitCode=0
Feb 01 07:12:26 crc kubenswrapper[5127]: I0201 07:12:26.311360 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerDied","Data":"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0"}
Feb 01 07:12:28 crc kubenswrapper[5127]: I0201 07:12:28.336399 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerStarted","Data":"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6"}
Feb 01 07:12:28 crc kubenswrapper[5127]: I0201 07:12:28.368183 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hkhkn" podStartSLOduration=3.238291505 podStartE2EDuration="7.368165407s" podCreationTimestamp="2026-02-01 07:12:21 +0000 UTC" firstStartedPulling="2026-02-01 07:12:23.275314824 +0000 UTC m=+1493.761217187" lastFinishedPulling="2026-02-01 07:12:27.405188696 +0000 UTC m=+1497.891091089" observedRunningTime="2026-02-01 07:12:28.359891863 +0000 UTC m=+1498.845794226" watchObservedRunningTime="2026-02-01 07:12:28.368165407 +0000 UTC m=+1498.854067770"
Feb 01 07:12:28 crc kubenswrapper[5127]: I0201 07:12:28.486123 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 01 07:12:28 crc kubenswrapper[5127]: I0201 07:12:28.487710 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 01 07:12:28 crc kubenswrapper[5127]: I0201 07:12:28.494755 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 01 07:12:29 crc kubenswrapper[5127]: I0201 07:12:29.367786 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 01 07:12:32 crc kubenswrapper[5127]: I0201 07:12:32.154476 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:32 crc kubenswrapper[5127]: I0201 07:12:32.155122 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:33 crc kubenswrapper[5127]: I0201 07:12:33.212367 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hkhkn" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="registry-server" probeResult="failure" output=<
Feb 01 07:12:33 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 07:12:33 crc kubenswrapper[5127]: >
Feb 01 07:12:42 crc kubenswrapper[5127]: I0201 07:12:42.255545 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:42 crc kubenswrapper[5127]: I0201 07:12:42.349144 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hkhkn"
Feb 01 07:12:42 crc kubenswrapper[5127]: I0201 07:12:42.509723 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"]
Feb 01 07:12:43 crc kubenswrapper[5127]: I0201 07:12:43.530619 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hkhkn" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="registry-server" containerID="cri-o://2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6" gracePeriod=2
Feb 01 07:12:44 crc
kubenswrapper[5127]: I0201 07:12:44.051063 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hkhkn" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.199118 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities\") pod \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.199469 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vp7j\" (UniqueName: \"kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j\") pod \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.199515 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content\") pod \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\" (UID: \"e6fed8b9-3dfb-499c-9cc5-13d4835fff68\") " Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.200057 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities" (OuterVolumeSpecName: "utilities") pod "e6fed8b9-3dfb-499c-9cc5-13d4835fff68" (UID: "e6fed8b9-3dfb-499c-9cc5-13d4835fff68"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.207148 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j" (OuterVolumeSpecName: "kube-api-access-5vp7j") pod "e6fed8b9-3dfb-499c-9cc5-13d4835fff68" (UID: "e6fed8b9-3dfb-499c-9cc5-13d4835fff68"). InnerVolumeSpecName "kube-api-access-5vp7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.301492 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vp7j\" (UniqueName: \"kubernetes.io/projected/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-kube-api-access-5vp7j\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.301524 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.324246 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6fed8b9-3dfb-499c-9cc5-13d4835fff68" (UID: "e6fed8b9-3dfb-499c-9cc5-13d4835fff68"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.403468 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6fed8b9-3dfb-499c-9cc5-13d4835fff68-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.544842 5127 generic.go:334] "Generic (PLEG): container finished" podID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerID="2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6" exitCode=0 Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.544919 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hkhkn" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.544963 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerDied","Data":"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6"} Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.545280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hkhkn" event={"ID":"e6fed8b9-3dfb-499c-9cc5-13d4835fff68","Type":"ContainerDied","Data":"415dab1394fcb324b8d0312c50bcfa58446a45c69e6535f81f32b985e88814d4"} Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.545317 5127 scope.go:117] "RemoveContainer" containerID="2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.585890 5127 scope.go:117] "RemoveContainer" containerID="38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.613722 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"] Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.633639 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hkhkn"] Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.634235 5127 scope.go:117] "RemoveContainer" containerID="a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.671471 5127 scope.go:117] "RemoveContainer" containerID="2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6" Feb 01 07:12:44 crc kubenswrapper[5127]: E0201 07:12:44.672123 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6\": container with ID starting with 2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6 not found: ID does not exist" containerID="2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.672162 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6"} err="failed to get container status \"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6\": rpc error: code = NotFound desc = could not find container \"2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6\": container with ID starting with 2b6cf9e9158b0e55e011afe3083169b488f677b866c4ec2474e7c254044bcfb6 not found: ID does not exist" Feb 01 07:12:44 crc 
kubenswrapper[5127]: I0201 07:12:44.672189 5127 scope.go:117] "RemoveContainer" containerID="38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0" Feb 01 07:12:44 crc kubenswrapper[5127]: E0201 07:12:44.672557 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0\": container with ID starting with 38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0 not found: ID does not exist" containerID="38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.672601 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0"} err="failed to get container status \"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0\": rpc error: code = NotFound desc = could not find container \"38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0\": container with ID starting with 38ae0ffc42e067fe8f2524d3c83e68af8a4a484f5dccc9804f65366d9fed0fc0 not found: ID does not exist" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.672619 5127 scope.go:117] "RemoveContainer" containerID="a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675" Feb 01 07:12:44 crc kubenswrapper[5127]: E0201 07:12:44.672955 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675\": container with ID starting with a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675 not found: ID does not exist" containerID="a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675" Feb 01 07:12:44 crc kubenswrapper[5127]: I0201 07:12:44.672983 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675"} err="failed to get container status \"a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675\": rpc error: code = NotFound desc = could not find container \"a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675\": container with ID starting with a64ad4d4337a953fe865adab82c403c2a096cfbb0332b7639f21bbb5eb2ec675 not found: ID does not exist" Feb 01 07:12:46 crc kubenswrapper[5127]: I0201 07:12:46.254070 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" path="/var/lib/kubelet/pods/e6fed8b9-3dfb-499c-9cc5-13d4835fff68/volumes" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.183425 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"] Feb 01 07:12:48 crc kubenswrapper[5127]: E0201 07:12:48.184219 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="registry-server" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.184240 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="registry-server" Feb 01 07:12:48 crc kubenswrapper[5127]: E0201 07:12:48.184261 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="extract-content" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.184269 5127 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="extract-content" Feb 01 07:12:48 crc kubenswrapper[5127]: E0201 07:12:48.184300 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="extract-utilities" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.184309 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="extract-utilities" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.184521 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fed8b9-3dfb-499c-9cc5-13d4835fff68" containerName="registry-server" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.185689 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.221977 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.223509 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.266318 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.303404 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.303755 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5w8x\" (UniqueName: \"kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.303835 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.303887 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.303911 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc 
kubenswrapper[5127]: I0201 07:12:48.304000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.304083 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.304139 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.304203 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.304228 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wgt6\" (UniqueName: \"kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.304427 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.383089 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.386226 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" containerName="openstackclient" containerID="cri-o://4acd4b5b4ff519a2d04a0bd77806acea282f43ce9562e95499145235dc585912" gracePeriod=2 Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.405302 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406312 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406389 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406482 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406524 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406563 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wgt6\" (UniqueName: \"kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406630 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406652 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5w8x\" (UniqueName: \"kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.406702 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.407088 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 
07:12:48.413907 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.416915 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.428993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.430391 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.431006 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.434548 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.452778 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.468711 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.481236 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wgt6\" (UniqueName: \"kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6\") pod \"barbican-keystone-listener-76b4c49b66-pjvd5\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 
07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.483256 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5w8x\" (UniqueName: \"kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x\") pod \"barbican-worker-5759588f57-nkg6k\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.516306 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.555067 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.598706 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-l8j7b"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.784887 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-l8j7b"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.795548 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.813870 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.814081 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6f5bs" podUID="24a9fd1d-985f-497f-9b8e-773013dc8747" containerName="openstack-network-exporter" containerID="cri-o://fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84" gracePeriod=30 Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.832760 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.869247 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-g7c4t"] Feb 01 07:12:48 crc kubenswrapper[5127]: E0201 07:12:48.869640 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" containerName="openstackclient" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.869653 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" containerName="openstackclient" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.869871 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" containerName="openstackclient" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.870456 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.885280 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.905036 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:12:48 crc kubenswrapper[5127]: I0201 07:12:48.961645 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g7c4t"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.017377 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5x42\" (UniqueName: \"kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.017501 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.017618 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.017670 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:49.517652769 +0000 UTC m=+1520.003555132 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.033659 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-w7586"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.116794 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-w7f6k"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.122428 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.122549 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5x42\" (UniqueName: \"kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.123357 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.163965 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.174366 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.184704 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5x42\" (UniqueName: \"kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42\") pod \"root-account-create-update-g7c4t\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.186997 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.196048 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-w7586"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.264487 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-w7f6k"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.280122 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.334663 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.328639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnnn5\" (UniqueName: \"kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.340852 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.364431 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.411870 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.432709 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.442521 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-str96\" (UniqueName: \"kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.442575 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnnn5\" (UniqueName: \"kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.442649 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.442669 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.443717 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.450615 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.510533 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jmcld"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.526389 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnnn5\" (UniqueName: \"kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5\") pod \"nova-api-bb8a-account-create-update-xrpts\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.542519 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jmcld"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.544188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.544335 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-str96\" (UniqueName: \"kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.545157 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.545167 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.545442 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:50.545421742 +0000 UTC m=+1521.031324105 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.582042 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-4qmcj"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.604093 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-str96\" (UniqueName: \"kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96\") pod \"nova-cell0-4519-account-create-update-8tjgc\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.604167 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b29e-account-create-update-tsc7k"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.630871 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.690638 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-4qmcj"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.701794 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b29e-account-create-update-tsc7k"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.716009 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.716246 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="ovn-northd" containerID="cri-o://0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" gracePeriod=30 Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.716475 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="openstack-network-exporter" containerID="cri-o://44b08e72c489b008fa46527782b6bdc9a481d3a4439b530c26416808e1a4301f" gracePeriod=30 Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.764650 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-293b-account-create-update-9chp8"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.786470 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-293b-account-create-update-9chp8"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.811180 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9eab-account-create-update-f4jqw"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.812688 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9eab-account-create-update-f4jqw"] Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.827632 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="galera" probeResult="failure" output="command timed out" Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.869371 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a9fd1d_985f_497f_9b8e_773013dc8747.slice/crio-fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84.scope\": RecentStats: unable to find data in memory cache]" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.871705 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.877748 5127 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-hqn86" message=< Feb 01 07:12:49 crc kubenswrapper[5127]: Exiting ovn-controller (1) [ OK ] Feb 01 07:12:49 crc kubenswrapper[5127]: > Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.877787 5127 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-hqn86" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" containerID="cri-o://86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.877843 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-hqn86" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" containerID="cri-o://86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" gracePeriod=29 Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.889346 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6f5bs_24a9fd1d-985f-497f-9b8e-773013dc8747/openstack-network-exporter/0.log" Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.893661 5127 generic.go:334] "Generic (PLEG): container finished" podID="24a9fd1d-985f-497f-9b8e-773013dc8747" containerID="fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84" exitCode=2 Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.893720 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f5bs" event={"ID":"24a9fd1d-985f-497f-9b8e-773013dc8747","Type":"ContainerDied","Data":"fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84"} Feb 01 07:12:49 crc kubenswrapper[5127]: I0201 07:12:49.904686 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ssck9"] Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.941176 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349 is running failed: container process not found" containerID="86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.946069 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349 is running failed: container process not found" containerID="86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" 
cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.962003 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349 is running failed: container process not found" containerID="86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 01 07:12:49 crc kubenswrapper[5127]: E0201 07:12:49.962088 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-hqn86" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.080936 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ssck9"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.097119 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z9xf9"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.139655 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z9xf9"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.171764 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.172699 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="openstack-network-exporter" containerID="cri-o://49ff01000f18ae004dcef08ba577c2e60ccd6f97ac2dd571eedb3934f8d4d73e" gracePeriod=300 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.191624 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-mxrjb"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.209268 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5ng2k"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.222399 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-mxrjb"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.231441 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5ng2k"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.258173 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d62e96f-7e79-4c05-8c2e-2656ef444f4a" path="/var/lib/kubelet/pods/2d62e96f-7e79-4c05-8c2e-2656ef444f4a/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.258944 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45bf5410-d21e-44a3-b4c7-11fdd25902d0" path="/var/lib/kubelet/pods/45bf5410-d21e-44a3-b4c7-11fdd25902d0/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.259455 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4d5a37-3a02-493f-9cf9-d53931c2a92b" path="/var/lib/kubelet/pods/4f4d5a37-3a02-493f-9cf9-d53931c2a92b/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.262029 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657c0b79-3594-4a70-a7de-6152741e8148" 
path="/var/lib/kubelet/pods/657c0b79-3594-4a70-a7de-6152741e8148/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.262548 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e62122a-43cd-4d84-a3b9-2ab7472dbf1c" path="/var/lib/kubelet/pods/6e62122a-43cd-4d84-a3b9-2ab7472dbf1c/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.265546 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7692c2d1-b96e-4d2d-b0b8-039a5125c9b8" path="/var/lib/kubelet/pods/7692c2d1-b96e-4d2d-b0b8-039a5125c9b8/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.266878 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8876b764-7eab-4430-ae8d-b0d88f3f4394" path="/var/lib/kubelet/pods/8876b764-7eab-4430-ae8d-b0d88f3f4394/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.267401 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a4da05-deae-4395-a91b-b8ddfb804f8a" path="/var/lib/kubelet/pods/c6a4da05-deae-4395-a91b-b8ddfb804f8a/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.268097 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9af14ed-135f-45d2-9aca-55513eb0e860" path="/var/lib/kubelet/pods/c9af14ed-135f-45d2-9aca-55513eb0e860/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.269489 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5df89b-8911-4464-8b7f-c9716a7243ea" path="/var/lib/kubelet/pods/dc5df89b-8911-4464-8b7f-c9716a7243ea/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.270264 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e206ee74-3e4a-48d2-b7d1-af07cd542f72" path="/var/lib/kubelet/pods/e206ee74-3e4a-48d2-b7d1-af07cd542f72/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.273660 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe1b844d-6fec-4f41-83d3-62fe94a2aa43" path="/var/lib/kubelet/pods/fe1b844d-6fec-4f41-83d3-62fe94a2aa43/volumes" Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.276706 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vgtk6"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.291822 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vgtk6"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.303389 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.303680 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="cinder-scheduler" containerID="cri-o://45488eefbe618c6ed70968bb3a79848f397c02da3176113bc9124b98acb538e2" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.304092 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="probe" containerID="cri-o://0be8d4cb9574063f87962b5663f7c99862b6167cbe906b2f8987098ff021beff" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.309730 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="ovsdbserver-nb" 
containerID="cri-o://df507cd5ec54f237e6d768044e6d52556d60d036c40521c2cb898928ad478155" gracePeriod=300 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.316033 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.316590 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="openstack-network-exporter" containerID="cri-o://00429248c74c1cbdec0c992d840fe52b3fb9bf53f5c7b39b33a5b2a1b7997c03" gracePeriod=300 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.330465 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.330708 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api-log" containerID="cri-o://3ef4cc3b13aaf195c0e9ab17d2d878bd41f2a1d4e67c807b8411510d47ddce71" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.331041 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api" containerID="cri-o://c118e21ba6efeaa3a7ba640aedf062451fb2a67b8769dd72709cae85ff970c12" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.364093 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.364538 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-log" containerID="cri-o://6c3858c14ef4c311b1deda9d45684f86e030100946c594b504545c60e4d6512d" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.364807 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-httpd" containerID="cri-o://bb0ddfad39e508e52c1255ed282f7f8d3226087a328247cb63a34d4e3ddca978" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.391148 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.412553 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.412802 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fbd756774-8bz24" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-log" containerID="cri-o://280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.413139 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6fbd756774-8bz24" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-api" containerID="cri-o://bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5" gracePeriod=30 Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.426324 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] 
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.426549 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-log" containerID="cri-o://9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.426976 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-httpd" containerID="cri-o://fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.436638 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-47sg7"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.462632 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-47sg7"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.471935 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.472190 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="dnsmasq-dns" containerID="cri-o://f162bb848cfee5be37c6d67f9f232905d8e7c65a774425c0a49d943f58e74593" gracePeriod=10
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487084 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487634 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-server" containerID="cri-o://867db1559afad96de83c300d6dc76b9f79d3c9220b0e2eb9728b097d71713a33" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487638 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-expirer" containerID="cri-o://5653ed02c5b90531b86d9ac767b79937dac0b76281e108a3a937c34943529698" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487757 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-updater" containerID="cri-o://95be98dc047c279bbce09d7aa189270919803433cfe5dc74d1073beb651e9b25" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487787 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="rsync" containerID="cri-o://005c8c714d8be3311a798fc93522b27e5504130f9c1fa418f83c1ab86906035c" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487769 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="swift-recon-cron" containerID="cri-o://2346499dd9e7c21de3823593069c8520d97c16bb6dede126e55ac71fc4a085b0" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487844 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-auditor" containerID="cri-o://914b8bb69bc3bfe2d7935699ef76aca574042432793c4d5754b940ebe207865b" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487858 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-replicator" containerID="cri-o://d54876c49569e6c608f8538949b55b1c199573d434261c07be1e7783a323003f" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487885 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-server" containerID="cri-o://c9704dce7fc0e07cc3a655f4772728e2831f5da440ded50d8d887f0c56f5d13f" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487893 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-replicator" containerID="cri-o://d37333ecd6017a5cdc098711dfbdfa4e7ddb88dafd4fb0421fa3c8183a90db30" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487918 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-reaper" containerID="cri-o://38b76cedf92a4bf003f4c614f64605b3a7cbd585d2e9ecb5e1043de091b2dd25" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487924 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-server" containerID="cri-o://761c57fdee0d6b1288274f98290bb8cd974e5bc157c50992d2820212429734cd" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487948 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-auditor" containerID="cri-o://2ba3574e531a65aa332d467e2a747abc31633121e49ff04e0c8f64ec009d6670" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487957 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-updater" containerID="cri-o://70ea6924342a0f91d794401284c82d8dac971be34d4d31d1be2404903e52efc7" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.487980 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-replicator" containerID="cri-o://d7b1d3dad0001903399762e7d439bb31968d3d63d3ab70bde24fdd9f1e6316ee" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.488007 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-auditor" containerID="cri-o://cd0408279605bd61bef597ecbbdac3b1f047aa35e8239141c5d37982ce44fb47" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.504602 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bcb954fdc-q646r"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.504842 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bcb954fdc-q646r" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-api" containerID="cri-o://3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.504982 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7bcb954fdc-q646r" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-httpd" containerID="cri-o://1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.505830 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="ovsdbserver-sb" containerID="cri-o://b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" gracePeriod=300
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.506011 5127 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.22:54574->38.102.83.22:39685: write tcp 38.102.83.22:54574->38.102.83.22:39685: write: broken pipe
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.520775 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jr4gn"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.534755 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jr4gn"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.563323 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c35b-account-create-update-fclsw"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.572840 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.576457 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c35b-account-create-update-fclsw"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.583727 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.593748 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" cmd=["/usr/bin/pidof","ovsdb-server"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.593819 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="ovsdbserver-sb"
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.610020 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.610232 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log" containerID="cri-o://fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.610678 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata" containerID="cri-o://f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.626772 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-g4264"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.635761 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-g4264"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.641544 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.641634 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:52.641613635 +0000 UTC m=+1523.127516008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.654757 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bfb0-account-create-update-mjdkv"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.664426 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bfb0-account-create-update-mjdkv"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.676265 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.676510 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-log" containerID="cri-o://f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.676996 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-api" containerID="cri-o://603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.682117 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.682298 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-httpd" containerID="cri-o://8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.682384 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-server" containerID="cri-o://f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.694540 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-l2pjk"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.702187 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-l2pjk"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.708115 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6px8t"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.720674 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6px8t"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.739093 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-grk2z"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.782842 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-grk2z"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.785629 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" containerID="cri-o://3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" gracePeriod=29
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.827191 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.837463 5127 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 01 07:12:50 crc kubenswrapper[5127]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 01 07:12:50 crc kubenswrapper[5127]: + source /usr/local/bin/container-scripts/functions
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNBridge=br-int
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNRemote=tcp:localhost:6642
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNEncapType=geneve
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNAvailabilityZones=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ EnableChassisAsGateway=true
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ PhysicalNetworks=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNHostName=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ ovs_dir=/var/lib/openvswitch
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + cleanup_ovsdb_server_semaphore
Feb 01 07:12:50 crc kubenswrapper[5127]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 01 07:12:50 crc kubenswrapper[5127]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-9przj" message=<
Feb 01 07:12:50 crc kubenswrapper[5127]: Exiting ovsdb-server (5) [ OK ]
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 01 07:12:50 crc kubenswrapper[5127]: + source /usr/local/bin/container-scripts/functions
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNBridge=br-int
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNRemote=tcp:localhost:6642
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNEncapType=geneve
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNAvailabilityZones=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ EnableChassisAsGateway=true
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ PhysicalNetworks=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNHostName=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ ovs_dir=/var/lib/openvswitch
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + cleanup_ovsdb_server_semaphore
Feb 01 07:12:50 crc kubenswrapper[5127]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 01 07:12:50 crc kubenswrapper[5127]: >
Feb 01 07:12:50 crc kubenswrapper[5127]: E0201 07:12:50.837502 5127 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 01 07:12:50 crc kubenswrapper[5127]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Feb 01 07:12:50 crc kubenswrapper[5127]: + source /usr/local/bin/container-scripts/functions
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNBridge=br-int
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNRemote=tcp:localhost:6642
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNEncapType=geneve
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNAvailabilityZones=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ EnableChassisAsGateway=true
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ PhysicalNetworks=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ OVNHostName=
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ DB_FILE=/etc/openvswitch/conf.db
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ ovs_dir=/var/lib/openvswitch
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Feb 01 07:12:50 crc kubenswrapper[5127]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + sleep 0.5
Feb 01 07:12:50 crc kubenswrapper[5127]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Feb 01 07:12:50 crc kubenswrapper[5127]: + cleanup_ovsdb_server_semaphore
Feb 01 07:12:50 crc kubenswrapper[5127]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Feb 01 07:12:50 crc kubenswrapper[5127]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Feb 01 07:12:50 crc kubenswrapper[5127]: > pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" containerID="cri-o://4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783"
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.837546 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" containerID="cri-o://4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" gracePeriod=29
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.892743 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ad9f-account-create-update-mv52x"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.904492 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.904747 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker-log" containerID="cri-o://a3a77f3f69d363acbcf4efc5d0f20f16e293179511ce86f0cdd3c1b58066afa5" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.906478 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker" containerID="cri-o://29d8d027dbe06246751c1b56e85016b77f2dd4ca87ded166e55fa2c4832c64ec" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.911461 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nn8d9"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.917113 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ad9f-account-create-update-mv52x"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.929395 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nn8d9"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.938523 5127 generic.go:334] "Generic (PLEG): container finished" podID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerID="fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75" exitCode=143
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.938619 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerDied","Data":"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75"}
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.944496 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.950985 5127 generic.go:334] "Generic (PLEG): container finished" podID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerID="44b08e72c489b008fa46527782b6bdc9a481d3a4439b530c26416808e1a4301f" exitCode=2
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.951064 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerDied","Data":"44b08e72c489b008fa46527782b6bdc9a481d3a4439b530c26416808e1a4301f"}
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.954118 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c8jq7"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.956103 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerStarted","Data":"d23c240700660131c41788080e6a4a7bff561ffb7789deafac9dc5832ba94354"}
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.956151 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerStarted","Data":"d28cff8b39b659cdf4b7caa9656f240d35e8ee4cd34b71e9e2c6158d6335ee31"}
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.964600 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c8jq7"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.977975 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.982305 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.982574 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://59ccaa8d2bba84519b6c4dd0057f50e45fb9b19f2e881b045bcfc6bbe203275b" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.984803 5127 generic.go:334] "Generic (PLEG): container finished" podID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerID="6c3858c14ef4c311b1deda9d45684f86e030100946c594b504545c60e4d6512d" exitCode=143
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.984887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerDied","Data":"6c3858c14ef4c311b1deda9d45684f86e030100946c594b504545c60e4d6512d"}
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.989266 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"]
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.989500 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener-log" containerID="cri-o://a3ab6404657e8a50a3cc043680876f80e95b3982cb32682933e72885d036811f" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.989594 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener" containerID="cri-o://c29366b00ecfb7dffff5a9a80692040e245c0a01c1fbbaf5d4d101f0738c006c" gracePeriod=30
Feb 01 07:12:50 crc kubenswrapper[5127]: I0201 07:12:50.995546 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-sflr2"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.005434 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.013169 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6f5bs_24a9fd1d-985f-497f-9b8e-773013dc8747/openstack-network-exporter/0.log"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.013230 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6f5bs"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.020557 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.027912 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.028165 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6969499d9b-sjxsr" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api-log" containerID="cri-o://829b8906c7d8a005a0f0715b5027bcf6b0f42ef7cc11158f5d59737c1d368916" gracePeriod=30
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.028329 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6969499d9b-sjxsr" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api" containerID="cri-o://62c7e7aeed632c501e98dba48dbb0ca73647880b2adcc2c89f331f126d30002a" gracePeriod=30
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.045963 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-sflr2"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.056304 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_36a8a4ac-b308-4bb8-be43-dddca18b1bc1/ovsdbserver-sb/0.log"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.056337 5127 generic.go:334] "Generic (PLEG): container finished" podID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerID="00429248c74c1cbdec0c992d840fe52b3fb9bf53f5c7b39b33a5b2a1b7997c03" exitCode=2
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.056352 5127 generic.go:334] "Generic (PLEG): container finished" podID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerID="b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.056389 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerDied","Data":"00429248c74c1cbdec0c992d840fe52b3fb9bf53f5c7b39b33a5b2a1b7997c03"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.056412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerDied","Data":"b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.058290 5127 generic.go:334] "Generic (PLEG): container finished" podID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerID="280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.058317 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerDied","Data":"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.063082 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.063241 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerName="nova-scheduler-scheduler" containerID="cri-o://e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" gracePeriod=30
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.076571 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.078095 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d523dcf2-c3fd-4473-ae9b-27e64a77205d/ovsdbserver-nb/0.log"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.078285 5127 generic.go:334] "Generic (PLEG): container finished" podID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerID="49ff01000f18ae004dcef08ba577c2e60ccd6f97ac2dd571eedb3934f8d4d73e" exitCode=2
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.078307 5127 generic.go:334] "Generic (PLEG): container finished" podID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerID="df507cd5ec54f237e6d768044e6d52556d60d036c40521c2cb898928ad478155" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.078356 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerDied","Data":"49ff01000f18ae004dcef08ba577c2e60ccd6f97ac2dd571eedb3934f8d4d73e"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.078383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerDied","Data":"df507cd5ec54f237e6d768044e6d52556d60d036c40521c2cb898928ad478155"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102443 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102525 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102677 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102757 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102877 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.102932 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle\") pod \"24a9fd1d-985f-497f-9b8e-773013dc8747\" (UID: \"24a9fd1d-985f-497f-9b8e-773013dc8747\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.104105 5127 generic.go:334] "Generic (PLEG): container finished" podID="4b0be460-5699-4787-9c9e-90df6400faed" containerID="86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.104162 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86" event={"ID":"4b0be460-5699-4787-9c9e-90df6400faed","Type":"ContainerDied","Data":"86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.104226 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.105647 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.117742 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config" (OuterVolumeSpecName: "config") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.118259 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt" (OuterVolumeSpecName: "kube-api-access-qtvtt") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "kube-api-access-qtvtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.135205 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.153951 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="galera" containerID="cri-o://69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd" gracePeriod=30
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.155889 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163162 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="5653ed02c5b90531b86d9ac767b79937dac0b76281e108a3a937c34943529698" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163194 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="95be98dc047c279bbce09d7aa189270919803433cfe5dc74d1073beb651e9b25" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163204 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="914b8bb69bc3bfe2d7935699ef76aca574042432793c4d5754b940ebe207865b" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163211 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="d37333ecd6017a5cdc098711dfbdfa4e7ddb88dafd4fb0421fa3c8183a90db30" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163217 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="70ea6924342a0f91d794401284c82d8dac971be34d4d31d1be2404903e52efc7" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163223 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="cd0408279605bd61bef597ecbbdac3b1f047aa35e8239141c5d37982ce44fb47" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163229 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="d54876c49569e6c608f8538949b55b1c199573d434261c07be1e7783a323003f" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163235 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="38b76cedf92a4bf003f4c614f64605b3a7cbd585d2e9ecb5e1043de091b2dd25" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163241 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="2ba3574e531a65aa332d467e2a747abc31633121e49ff04e0c8f64ec009d6670" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163247 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="d7b1d3dad0001903399762e7d439bb31968d3d63d3ab70bde24fdd9f1e6316ee" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163284 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"5653ed02c5b90531b86d9ac767b79937dac0b76281e108a3a937c34943529698"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163307 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"95be98dc047c279bbce09d7aa189270919803433cfe5dc74d1073beb651e9b25"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163317 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"914b8bb69bc3bfe2d7935699ef76aca574042432793c4d5754b940ebe207865b"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163326 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"d37333ecd6017a5cdc098711dfbdfa4e7ddb88dafd4fb0421fa3c8183a90db30"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163334 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"70ea6924342a0f91d794401284c82d8dac971be34d4d31d1be2404903e52efc7"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163342 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"cd0408279605bd61bef597ecbbdac3b1f047aa35e8239141c5d37982ce44fb47"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163351 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"d54876c49569e6c608f8538949b55b1c199573d434261c07be1e7783a323003f"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163360 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"38b76cedf92a4bf003f4c614f64605b3a7cbd585d2e9ecb5e1043de091b2dd25"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163369 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"2ba3574e531a65aa332d467e2a747abc31633121e49ff04e0c8f64ec009d6670"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.163378 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"d7b1d3dad0001903399762e7d439bb31968d3d63d3ab70bde24fdd9f1e6316ee"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.175704 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-g7c4t"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.189023 5127 generic.go:334] "Generic (PLEG): container finished" podID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerID="f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.189108 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerDied","Data":"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a"}
Feb 01 07:12:51 crc kubenswrapper[5127]: W0201 07:12:51.202735 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61a37fc0_b8b7_4bbc_ab43_2dae28037ee0.slice/crio-c7dd845f3b9d3aa1343a1d795c0aa15b072ab1c8aaf2aa2de019e03dcc1a6054 WatchSource:0}: Error finding container c7dd845f3b9d3aa1343a1d795c0aa15b072ab1c8aaf2aa2de019e03dcc1a6054: Status 404 returned error can't find the container with id c7dd845f3b9d3aa1343a1d795c0aa15b072ab1c8aaf2aa2de019e03dcc1a6054
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.207967 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.208000 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a9fd1d-985f-497f-9b8e-773013dc8747-config\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.208010 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtvtt\" (UniqueName: \"kubernetes.io/projected/24a9fd1d-985f-497f-9b8e-773013dc8747-kube-api-access-qtvtt\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.208019 5127 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovs-rundir\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.208027 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24a9fd1d-985f-497f-9b8e-773013dc8747-ovn-rundir\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.217661 5127 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Feb 01 07:12:51 crc kubenswrapper[5127]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: MYSQL_CMD="mysql -h -u root -P 3306"
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: if [ -n "nova_api" ]; then
Feb 01 07:12:51 crc kubenswrapper[5127]: GRANT_DATABASE="nova_api"
Feb 01 07:12:51 crc kubenswrapper[5127]: else
Feb 01 07:12:51 crc kubenswrapper[5127]: GRANT_DATABASE="*"
Feb 01 07:12:51 crc kubenswrapper[5127]: fi
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: # going for maximum compatibility here:
Feb 01 07:12:51 crc kubenswrapper[5127]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Feb 01 07:12:51 crc kubenswrapper[5127]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Feb 01 07:12:51 crc kubenswrapper[5127]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Feb 01 07:12:51 crc kubenswrapper[5127]: # support updates
Feb 01 07:12:51 crc kubenswrapper[5127]:
Feb 01 07:12:51 crc kubenswrapper[5127]: $MYSQL_CMD < logger="UnhandledError"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.219511 5127 generic.go:334] "Generic (PLEG): container finished" podID="a3845481-effe-4cb2-9249-e9311df519a0" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.219710 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerDied","Data":"4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783"}
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.219969 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-bb8a-account-create-update-xrpts" podUID="a6a4a416-4347-4df8-80b1-edfa74abfe7e"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.232069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "24a9fd1d-985f-497f-9b8e-773013dc8747" (UID: "24a9fd1d-985f-497f-9b8e-773013dc8747"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.240443 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6f5bs_24a9fd1d-985f-497f-9b8e-773013dc8747/openstack-network-exporter/0.log"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.240530 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6f5bs" event={"ID":"24a9fd1d-985f-497f-9b8e-773013dc8747","Type":"ContainerDied","Data":"a6f3ae1340374b5cb40c86734020c01cdc48a6be1d2f748b1d07a133c4ed4260"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.240556 5127 scope.go:117] "RemoveContainer" containerID="fad044ef24a3873c346ee951546b90bf471b60d1b16cddbb5e20a468c5063b84"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.240690 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6f5bs"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.281922 5127 generic.go:334] "Generic (PLEG): container finished" podID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerID="f162bb848cfee5be37c6d67f9f232905d8e7c65a774425c0a49d943f58e74593" exitCode=0
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.282033 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" event={"ID":"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a","Type":"ContainerDied","Data":"f162bb848cfee5be37c6d67f9f232905d8e7c65a774425c0a49d943f58e74593"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.299371 5127 generic.go:334] "Generic (PLEG): container finished" podID="cdafa63d-9b24-454c-a217-e53024719e75" containerID="3ef4cc3b13aaf195c0e9ab17d2d878bd41f2a1d4e67c807b8411510d47ddce71" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.299540 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerDied","Data":"3ef4cc3b13aaf195c0e9ab17d2d878bd41f2a1d4e67c807b8411510d47ddce71"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.305247 5127 generic.go:334] "Generic (PLEG): container finished" podID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" containerID="4acd4b5b4ff519a2d04a0bd77806acea282f43ce9562e95499145235dc585912" exitCode=137
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.310671 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.310749 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24a9fd1d-985f-497f-9b8e-773013dc8747-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.320474 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerID="9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf" exitCode=143
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.320516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerDied","Data":"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf"}
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.327014 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6f5bs"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.366413 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529186 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529540 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529574 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529612 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529688 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529715 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57jr5\" (UniqueName: \"kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.529743 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run\") pod \"4b0be460-5699-4787-9c9e-90df6400faed\" (UID: \"4b0be460-5699-4787-9c9e-90df6400faed\") "
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.530171 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run" (OuterVolumeSpecName: "var-run") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.530209 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.531895 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts" (OuterVolumeSpecName: "scripts") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.531934 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.555747 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.555771 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5" (OuterVolumeSpecName: "kube-api-access-57jr5") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "kube-api-access-57jr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.564807 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.580937 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.581411 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerName="nova-scheduler-scheduler"
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.632192 5127 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.632227 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b0be460-5699-4787-9c9e-90df6400faed-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.632237 5127 reconciler_common.go:293] "Volume
detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.632245 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57jr5\" (UniqueName: \"kubernetes.io/projected/4b0be460-5699-4787-9c9e-90df6400faed-kube-api-access-57jr5\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.632255 5127 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4b0be460-5699-4787-9c9e-90df6400faed-var-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.686066 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.714850 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.724169 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.734485 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.808882 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "4b0be460-5699-4787-9c9e-90df6400faed" (UID: "4b0be460-5699-4787-9c9e-90df6400faed"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.833639 5127 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 01 07:12:51 crc kubenswrapper[5127]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:ed0f8ba03f3ce47a32006d730c3049455325eb2c3b98b9fd6b3fb9901004df13,Command:[/bin/sh -c #!/bin/bash Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: if [ -n "nova_cell0" ]; then Feb 01 07:12:51 crc kubenswrapper[5127]: GRANT_DATABASE="nova_cell0" Feb 01 07:12:51 crc kubenswrapper[5127]: else Feb 01 07:12:51 crc kubenswrapper[5127]: GRANT_DATABASE="*" Feb 01 07:12:51 crc kubenswrapper[5127]: fi Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: # going for maximum compatibility here: Feb 01 07:12:51 crc kubenswrapper[5127]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Feb 01 07:12:51 crc kubenswrapper[5127]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 01 07:12:51 crc kubenswrapper[5127]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 01 07:12:51 crc kubenswrapper[5127]: # support updates Feb 01 07:12:51 crc kubenswrapper[5127]: Feb 01 07:12:51 crc kubenswrapper[5127]: $MYSQL_CMD < logger="UnhandledError" Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.835821 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" podUID="7fbdc342-25af-4968-ad4c-5b294a488e39" Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.840340 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b0be460-5699-4787-9c9e-90df6400faed-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.840553 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.840570 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"] Feb 01 07:12:51 crc kubenswrapper[5127]: E0201 07:12:51.840658 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data podName:824fc658-1c02-4470-9ed3-e4123ddd7575 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:52.34063995 +0000 UTC m=+1522.826542313 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data") pod "rabbitmq-cell1-server-0" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575") : configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:51 crc kubenswrapper[5127]: I0201 07:12:51.956559 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.000501 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.002266 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d523dcf2-c3fd-4473-ae9b-27e64a77205d/ovsdbserver-nb/0.log" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.002334 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.025950 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_36a8a4ac-b308-4bb8-be43-dddca18b1bc1/ovsdbserver-sb/0.log" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.026022 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.082501 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
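
The heredoc body after $MYSQL_CMD is elided from the captured log, but the numbered comments in the container command spell out the pattern: create the account with a plain CREATE USER (MySQL 8 no longer creates users implicitly via GRANT), apply password and TLS settings with ALTER USER so a rerun updates an existing account (since MySQL lacks MariaDB's CREATE OR REPLACE), and only then GRANT. A minimal sketch of what such a heredoc typically contains, reusing the DatabasePassword and GRANT_DATABASE variables defined in the script above; the account name 'nova'@'%' is an illustrative placeholder, not taken from the log:

    $MYSQL_CMD <<EOF
    -- idempotent create: a no-op if the account already exists
    CREATE USER IF NOT EXISTS 'nova'@'%';
    -- password (and, where used, TLS requirements) via ALTER so reruns update
    ALTER USER 'nova'@'%' IDENTIFIED BY '${DatabasePassword}';
    GRANT ALL PRIVILEGES ON ${GRANT_DATABASE}.* TO 'nova'@'%';
    EOF

CREATE USER IF NOT EXISTS plus ALTER USER is accepted by both MariaDB and MySQL 5.7+, which matches the "maximum compatibility" goal the comments state; note also that GRANT_DATABASE="*" in the script's else-branch expands to *.* here.
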
Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153132 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153213 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153278 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153296 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153330 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153357 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153390 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153409 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153441 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k955w\" (UniqueName: \"kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153456 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: 
\"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153479 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153497 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4wnn\" (UniqueName: \"kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153511 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153531 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153548 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153814 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153835 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153852 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs\") pod \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\" (UID: \"d523dcf2-c3fd-4473-ae9b-27e64a77205d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153907 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: 
\"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153928 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9xgp\" (UniqueName: \"kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp\") pod \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\" (UID: \"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.153950 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir\") pod \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\" (UID: \"36a8a4ac-b308-4bb8-be43-dddca18b1bc1\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.154733 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.163827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.182390 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts" (OuterVolumeSpecName: "scripts") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.182837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config" (OuterVolumeSpecName: "config") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.188889 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts" (OuterVolumeSpecName: "scripts") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.216417 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config" (OuterVolumeSpecName: "config") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.257730 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle\") pod \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.257770 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret\") pod \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.257898 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config\") pod \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.257982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26lmx\" (UniqueName: \"kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx\") pod \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\" (UID: \"a15e38c1-f8c8-4e6c-9e52-1b39e952017d\") " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258413 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258424 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258434 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258443 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258450 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d523dcf2-c3fd-4473-ae9b-27e64a77205d-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.258458 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.305857 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.306073 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp" (OuterVolumeSpecName: "kube-api-access-t9xgp") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "kube-api-access-t9xgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.306166 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w" (OuterVolumeSpecName: "kube-api-access-k955w") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "kube-api-access-k955w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.318375 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn" (OuterVolumeSpecName: "kube-api-access-c4wnn") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "kube-api-access-c4wnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.321840 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.379385 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12387291-7208-4df1-b142-486a24065f71" path="/var/lib/kubelet/pods/12387291-7208-4df1-b142-486a24065f71/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.380944 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b313db-9404-4f6c-8998-800ea3110fc9" path="/var/lib/kubelet/pods/21b313db-9404-4f6c-8998-800ea3110fc9/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.381839 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a9fd1d-985f-497f-9b8e-773013dc8747" path="/var/lib/kubelet/pods/24a9fd1d-985f-497f-9b8e-773013dc8747/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.383280 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254e67ea-20e8-4960-ae74-c4d1bff0369a" path="/var/lib/kubelet/pods/254e67ea-20e8-4960-ae74-c4d1bff0369a/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.384265 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379e85af-3108-4c83-88cb-a71948674382" path="/var/lib/kubelet/pods/379e85af-3108-4c83-88cb-a71948674382/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.384884 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2" path="/var/lib/kubelet/pods/579b0c45-f8f2-4d42-8ad7-3a1ba479fdd2/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.385520 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ad9d98-24ab-40fe-ac49-63b423cd33de" path="/var/lib/kubelet/pods/64ad9d98-24ab-40fe-ac49-63b423cd33de/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.394190 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e86c9ae-529c-41fd-89e6-08de90de4684" path="/var/lib/kubelet/pods/6e86c9ae-529c-41fd-89e6-08de90de4684/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.395329 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afdf7282-6160-40d2-b9fa-803bf081e1b6" path="/var/lib/kubelet/pods/afdf7282-6160-40d2-b9fa-803bf081e1b6/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.396323 5127 generic.go:334] "Generic (PLEG): container finished" podID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" containerID="59ccaa8d2bba84519b6c4dd0057f50e45fb9b19f2e881b045bcfc6bbe203275b" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.407859 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx" (OuterVolumeSpecName: "kube-api-access-26lmx") pod "a15e38c1-f8c8-4e6c-9e52-1b39e952017d" (UID: "a15e38c1-f8c8-4e6c-9e52-1b39e952017d"). InnerVolumeSpecName "kube-api-access-26lmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.408939 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4fae6b8-8d43-4df5-b5c8-4482bf865a73" path="/var/lib/kubelet/pods/b4fae6b8-8d43-4df5-b5c8-4482bf865a73/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.409312 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k955w\" (UniqueName: \"kubernetes.io/projected/d523dcf2-c3fd-4473-ae9b-27e64a77205d-kube-api-access-k955w\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: E0201 07:12:52.409376 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:52 crc kubenswrapper[5127]: E0201 07:12:52.409448 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data podName:824fc658-1c02-4470-9ed3-e4123ddd7575 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:53.409417971 +0000 UTC m=+1523.895320344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data") pod "rabbitmq-cell1-server-0" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575") : configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.410615 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c2b589-6308-42bc-8b1e-c2d4f3e210b1" path="/var/lib/kubelet/pods/c4c2b589-6308-42bc-8b1e-c2d4f3e210b1/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.415228 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4wnn\" (UniqueName: \"kubernetes.io/projected/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-kube-api-access-c4wnn\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.416442 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.416465 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.416478 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9xgp\" (UniqueName: \"kubernetes.io/projected/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-kube-api-access-t9xgp\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.427178 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64132aa-a148-422a-8d80-b92f9005a34f" path="/var/lib/kubelet/pods/c64132aa-a148-422a-8d80-b92f9005a34f/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.437111 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_36a8a4ac-b308-4bb8-be43-dddca18b1bc1/ovsdbserver-sb/0.log" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.437283 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.439042 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ce0d4c-bd50-4466-83fb-68bea7c4ed61" path="/var/lib/kubelet/pods/e9ce0d4c-bd50-4466-83fb-68bea7c4ed61/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.452159 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb0ca89-32ae-4796-8b4e-b9ac35cfaafd" path="/var/lib/kubelet/pods/feb0ca89-32ae-4796-8b4e-b9ac35cfaafd/volumes" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.482376 5127 generic.go:334] "Generic (PLEG): container finished" podID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerID="a3a77f3f69d363acbcf4efc5d0f20f16e293179511ce86f0cdd3c1b58066afa5" exitCode=143 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.504703 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hqn86" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.517900 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26lmx\" (UniqueName: \"kubernetes.io/projected/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-kube-api-access-26lmx\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.536134 5127 generic.go:334] "Generic (PLEG): container finished" podID="48898154-9be0-400f-8e0b-ef721132db71" containerID="0be8d4cb9574063f87962b5663f7c99862b6167cbe906b2f8987098ff021beff" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.539764 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.552989 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener-log" containerID="cri-o://967655f08a7adad63c2db4ecc270313cb506f4dbb7d2de93e05145f31cc59387" gracePeriod=30 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.553480 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener" containerID="cri-o://4af9462363cbd843b41f9156dcb55cc0f9bf5eaaa495c0ecf057de6e3872a505" gracePeriod=30 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.592003 5127 generic.go:334] "Generic (PLEG): container finished" podID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerID="1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.600269 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" podStartSLOduration=4.600249451 podStartE2EDuration="4.600249451s" podCreationTimestamp="2026-02-01 07:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:12:52.575553504 +0000 UTC m=+1523.061455887" watchObservedRunningTime="2026-02-01 07:12:52.600249451 +0000 UTC m=+1523.086151814" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.616022 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_d523dcf2-c3fd-4473-ae9b-27e64a77205d/ovsdbserver-nb/0.log" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.616183 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.632725 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.638026 5127 generic.go:334] "Generic (PLEG): container finished" podID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerID="f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.638062 5127 generic.go:334] "Generic (PLEG): container finished" podID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerID="8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.648228 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.656928 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.677939 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.682771 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.692813 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.696403 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.704226 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a15e38c1-f8c8-4e6c-9e52-1b39e952017d" (UID: "a15e38c1-f8c8-4e6c-9e52-1b39e952017d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.712666 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5759588f57-nkg6k" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker-log" containerID="cri-o://d23c240700660131c41788080e6a4a7bff561ffb7789deafac9dc5832ba94354" gracePeriod=30 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.713214 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5759588f57-nkg6k" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker" containerID="cri-o://8aebd54ceb5f8e1f71a8b3d2cb3b9f0e38e504b94503477ac609f10e118d1895" gracePeriod=30 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728101 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728121 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728130 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728139 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728148 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728157 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.728165 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: E0201 07:12:52.728218 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:12:52 crc kubenswrapper[5127]: E0201 07:12:52.728254 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data 
podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:56.728240333 +0000 UTC m=+1527.214142696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.735927 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.741427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.754380 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5759588f57-nkg6k" podStartSLOduration=4.754358599 podStartE2EDuration="4.754358599s" podCreationTimestamp="2026-02-01 07:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:12:52.742296553 +0000 UTC m=+1523.228198916" watchObservedRunningTime="2026-02-01 07:12:52.754358599 +0000 UTC m=+1523.240260962" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.782756 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="005c8c714d8be3311a798fc93522b27e5504130f9c1fa418f83c1ab86906035c" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.782785 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="761c57fdee0d6b1288274f98290bb8cd974e5bc157c50992d2820212429734cd" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.782795 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="c9704dce7fc0e07cc3a655f4772728e2831f5da440ded50d8d887f0c56f5d13f" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.782804 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="867db1559afad96de83c300d6dc76b9f79d3c9220b0e2eb9728b097d71713a33" exitCode=0 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.796982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config" (OuterVolumeSpecName: "config") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.801114 5127 generic.go:334] "Generic (PLEG): container finished" podID="472be6e7-d046-4377-b055-50828b00b8cd" containerID="829b8906c7d8a005a0f0715b5027bcf6b0f42ef7cc11158f5d59737c1d368916" exitCode=143 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.802744 5127 generic.go:334] "Generic (PLEG): container finished" podID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerID="6163d1004ca352f7cf607b6d84cfea1f9bd6b47b3c1ec1ac1b077bbb590d2580" exitCode=1 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.803264 5127 scope.go:117] "RemoveContainer" containerID="6163d1004ca352f7cf607b6d84cfea1f9bd6b47b3c1ec1ac1b077bbb590d2580" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.811739 5127 generic.go:334] "Generic (PLEG): container finished" podID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerID="a3ab6404657e8a50a3cc043680876f80e95b3982cb32682933e72885d036811f" exitCode=143 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.829949 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.829970 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.829979 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.854386 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.866860 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a15e38c1-f8c8-4e6c-9e52-1b39e952017d" (UID: "a15e38c1-f8c8-4e6c-9e52-1b39e952017d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.874547 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a15e38c1-f8c8-4e6c-9e52-1b39e952017d" (UID: "a15e38c1-f8c8-4e6c-9e52-1b39e952017d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.914685 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920172 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e","Type":"ContainerDied","Data":"59ccaa8d2bba84519b6c4dd0057f50e45fb9b19f2e881b045bcfc6bbe203275b"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920224 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"36a8a4ac-b308-4bb8-be43-dddca18b1bc1","Type":"ContainerDied","Data":"8f8fa59ab6f1441bef02c359e839cede60b6e8dc026480d5e11ec944daf69e38"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920240 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerDied","Data":"a3a77f3f69d363acbcf4efc5d0f20f16e293179511ce86f0cdd3c1b58066afa5"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hqn86" event={"ID":"4b0be460-5699-4787-9c9e-90df6400faed","Type":"ContainerDied","Data":"967078256ac9193fa1d832542effe48272709fdb43e54388aa1cc62d7d25f55e"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920266 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerDied","Data":"0be8d4cb9574063f87962b5663f7c99862b6167cbe906b2f8987098ff021beff"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920278 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-fhj5m" event={"ID":"41cdcc7d-28e4-4f12-a2b2-5052f7872c1a","Type":"ContainerDied","Data":"89799b9f4e14929be8b177e15bc077f3232094ccf9f7f0fd4cc2b3fc7c05cb79"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920291 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerStarted","Data":"4af9462363cbd843b41f9156dcb55cc0f9bf5eaaa495c0ecf057de6e3872a505"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920301 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerStarted","Data":"967655f08a7adad63c2db4ecc270313cb506f4dbb7d2de93e05145f31cc59387"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920309 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerStarted","Data":"0f9652ab95a79490e372b040fabbf131ef3734793f6de210d667945bbfd031f8"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920320 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerDied","Data":"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d523dcf2-c3fd-4473-ae9b-27e64a77205d","Type":"ContainerDied","Data":"1e4b2acc6d88e8efd1357b52f1b803a8dec8b480fd4d2cef1e7c7c216de91616"} Feb 01 07:12:52 crc kubenswrapper[5127]: 
I0201 07:12:52.920342 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerDied","Data":"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920352 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerDied","Data":"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920351 5127 scope.go:117] "RemoveContainer" containerID="00429248c74c1cbdec0c992d840fe52b3fb9bf53f5c7b39b33a5b2a1b7997c03" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920367 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerStarted","Data":"8aebd54ceb5f8e1f71a8b3d2cb3b9f0e38e504b94503477ac609f10e118d1895"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920524 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"005c8c714d8be3311a798fc93522b27e5504130f9c1fa418f83c1ab86906035c"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920545 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"761c57fdee0d6b1288274f98290bb8cd974e5bc157c50992d2820212429734cd"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920555 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"c9704dce7fc0e07cc3a655f4772728e2831f5da440ded50d8d887f0c56f5d13f"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"867db1559afad96de83c300d6dc76b9f79d3c9220b0e2eb9728b097d71713a33"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerDied","Data":"829b8906c7d8a005a0f0715b5027bcf6b0f42ef7cc11158f5d59737c1d368916"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920602 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g7c4t" event={"ID":"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0","Type":"ContainerDied","Data":"6163d1004ca352f7cf607b6d84cfea1f9bd6b47b3c1ec1ac1b077bbb590d2580"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920614 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g7c4t" event={"ID":"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0","Type":"ContainerStarted","Data":"c7dd845f3b9d3aa1343a1d795c0aa15b072ab1c8aaf2aa2de019e03dcc1a6054"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerDied","Data":"a3ab6404657e8a50a3cc043680876f80e95b3982cb32682933e72885d036811f"} 
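
Note the durationBeforeRetry values in the MountVolume.SetUp failures recorded earlier in this window: 500ms, then 1s, for the rabbitmq-cell1 config-data volume, and 4s for the rabbitmq-server one. That progression is consistent with the kubelet retrying each failed volume operation with a doubling backoff until the missing ConfigMap appears, after which mounting proceeds without intervention. As an illustrative shell sketch only of that retry shape (the 120s cap is an assumption, not taken from the log):

    delay=0.5
    # keep polling until the ConfigMap the mount needs actually exists
    until kubectl get configmap rabbitmq-cell1-config-data -n openstack >/dev/null 2>&1; do
      sleep "$delay"
      # double the wait between attempts, capped
      delay=$(awk -v d="$delay" 'BEGIN { d = d * 2; if (d > 120) d = 120; print d }')
    done
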
Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920636 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" event={"ID":"7fbdc342-25af-4968-ad4c-5b294a488e39","Type":"ContainerStarted","Data":"d3a0381066378f0d603d60e9b65a88af0f351afcf5930dba92e3f5dd8c69a4b4"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.920665 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-xrpts" event={"ID":"a6a4a416-4347-4df8-80b1-edfa74abfe7e","Type":"ContainerStarted","Data":"593cb714131d2679f5fa969f42633f7e745a972bc17536edae7e3acfde1f8fa0"} Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.933821 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" (UID: "41cdcc7d-28e4-4f12-a2b2-5052f7872c1a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.935000 5127 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.935015 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.935024 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a15e38c1-f8c8-4e6c-9e52-1b39e952017d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.935033 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.984171 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.984483 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="rabbitmq" containerID="cri-o://9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395" gracePeriod=604800 Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.993087 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hqn86"] Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.994606 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "36a8a4ac-b308-4bb8-be43-dddca18b1bc1" (UID: "36a8a4ac-b308-4bb8-be43-dddca18b1bc1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.994846 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:12:52 crc kubenswrapper[5127]: I0201 07:12:52.996414 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.028735 5127 scope.go:117] "RemoveContainer" containerID="b18fd844746dcde376738e319e6c25da14f7a98194d8fe358010faf4ec0f974f" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.030735 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d523dcf2-c3fd-4473-ae9b-27e64a77205d" (UID: "d523dcf2-c3fd-4473-ae9b-27e64a77205d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.039168 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040668 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040703 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040732 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q8w8\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040806 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040842 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040864 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.040910 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd\") pod \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\" (UID: \"38d5ee07-f2ba-4a01-abab-aa8a58056a1b\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.041302 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36a8a4ac-b308-4bb8-be43-dddca18b1bc1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.041319 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d523dcf2-c3fd-4473-ae9b-27e64a77205d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.042206 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.048253 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.074786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8" (OuterVolumeSpecName: "kube-api-access-4q8w8") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "kube-api-access-4q8w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.078450 5127 scope.go:117] "RemoveContainer" containerID="86c82fa96979b94bd9d1fa42a155e2f7dc953fd90b307cb1c330b23a07563349" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.084390 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.116176 5127 scope.go:117] "RemoveContainer" containerID="f162bb848cfee5be37c6d67f9f232905d8e7c65a774425c0a49d943f58e74593" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.122408 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.139402 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149249 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data\") pod \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149437 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149500 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs\") pod \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149532 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle\") pod \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs\") pod \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149611 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149642 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696ct\" (UniqueName: \"kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149668 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76r8\" (UniqueName: \"kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8\") pod \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\" (UID: \"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149685 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149708 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149723 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149773 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.149833 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts\") pod \"02abfc06-bde0-4894-a5f8-f07207f1ba28\" (UID: \"02abfc06-bde0-4894-a5f8-f07207f1ba28\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.150290 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.150301 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q8w8\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-kube-api-access-4q8w8\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.150310 5127 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.150318 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.151002 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.153853 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.159501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.162408 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.166771 5127 scope.go:117] "RemoveContainer" containerID="0e8a3a74180f02556c4a75f8eb7281666b800f01ef25e92d264a64ed5ddcb187" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.194764 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct" (OuterVolumeSpecName: "kube-api-access-696ct") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "kube-api-access-696ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.194848 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8" (OuterVolumeSpecName: "kube-api-access-d76r8") pod "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" (UID: "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e"). InnerVolumeSpecName "kube-api-access-d76r8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.232894 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.241338 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-fhj5m"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.247051 5127 scope.go:117] "RemoveContainer" containerID="49ff01000f18ae004dcef08ba577c2e60ccd6f97ac2dd571eedb3934f8d4d73e" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254252 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696ct\" (UniqueName: \"kubernetes.io/projected/02abfc06-bde0-4894-a5f8-f07207f1ba28-kube-api-access-696ct\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254280 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76r8\" (UniqueName: \"kubernetes.io/projected/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-kube-api-access-d76r8\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254289 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254297 5127 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254306 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02abfc06-bde0-4894-a5f8-f07207f1ba28-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.254314 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02abfc06-bde0-4894-a5f8-f07207f1ba28-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.276925 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.283148 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.288500 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.314848 5127 scope.go:117] "RemoveContainer" containerID="df507cd5ec54f237e6d768044e6d52556d60d036c40521c2cb898928ad478155" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.326679 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.379016 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.379216 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.396374 5127 scope.go:117] "RemoveContainer" containerID="f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.396617 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.400895 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data" (OuterVolumeSpecName: "config-data") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.401344 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" (UID: "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.408125 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.409248 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data" (OuterVolumeSpecName: "config-data") pod "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" (UID: "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.423857 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" (UID: "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.423952 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.425727 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "38d5ee07-f2ba-4a01-abab-aa8a58056a1b" (UID: "38d5ee07-f2ba-4a01-abab-aa8a58056a1b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.461791 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" (UID: "29d4b0b6-6bf1-466a-a0b5-dee3b16a533e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480067 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-str96\" (UniqueName: \"kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96\") pod \"7fbdc342-25af-4968-ad4c-5b294a488e39\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480188 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts\") pod \"7fbdc342-25af-4968-ad4c-5b294a488e39\" (UID: \"7fbdc342-25af-4968-ad4c-5b294a488e39\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480695 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fbdc342-25af-4968-ad4c-5b294a488e39" (UID: "7fbdc342-25af-4968-ad4c-5b294a488e39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: E0201 07:12:53.480786 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:53 crc kubenswrapper[5127]: E0201 07:12:53.480837 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data podName:824fc658-1c02-4470-9ed3-e4123ddd7575 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:55.480821115 +0000 UTC m=+1525.966723478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data") pod "rabbitmq-cell1-server-0" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575") : configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480939 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480959 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480969 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fbdc342-25af-4968-ad4c-5b294a488e39-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480980 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480988 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.480997 5127 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.481007 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.481016 5127 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.481025 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d5ee07-f2ba-4a01-abab-aa8a58056a1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.489188 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96" (OuterVolumeSpecName: "kube-api-access-str96") pod "7fbdc342-25af-4968-ad4c-5b294a488e39" (UID: "7fbdc342-25af-4968-ad4c-5b294a488e39"). InnerVolumeSpecName "kube-api-access-str96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.525949 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.548732 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "02abfc06-bde0-4894-a5f8-f07207f1ba28" (UID: "02abfc06-bde0-4894-a5f8-f07207f1ba28"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.583003 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.583031 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-str96\" (UniqueName: \"kubernetes.io/projected/7fbdc342-25af-4968-ad4c-5b294a488e39-kube-api-access-str96\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.583042 5127 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02abfc06-bde0-4894-a5f8-f07207f1ba28-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.841759 5127 generic.go:334] "Generic (PLEG): container finished" podID="48898154-9be0-400f-8e0b-ef721132db71" containerID="45488eefbe618c6ed70968bb3a79848f397c02da3176113bc9124b98acb538e2" exitCode=0 Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.842434 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerDied","Data":"45488eefbe618c6ed70968bb3a79848f397c02da3176113bc9124b98acb538e2"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.842461 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48898154-9be0-400f-8e0b-ef721132db71","Type":"ContainerDied","Data":"2398a517b2bc7bf270cfec8577c2094869842f4a692025d3af41f561ab33fc3e"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.842472 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2398a517b2bc7bf270cfec8577c2094869842f4a692025d3af41f561ab33fc3e" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.844589 5127 generic.go:334] "Generic (PLEG): container finished" podID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerID="c359e7f7badca9f1e344c4f8fa4ef9cdc47db3820c690b533ab6b480f1e9cced" exitCode=1 Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.844640 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g7c4t" event={"ID":"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0","Type":"ContainerDied","Data":"c359e7f7badca9f1e344c4f8fa4ef9cdc47db3820c690b533ab6b480f1e9cced"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.845172 5127 scope.go:117] "RemoveContainer" containerID="c359e7f7badca9f1e344c4f8fa4ef9cdc47db3820c690b533ab6b480f1e9cced" Feb 01 07:12:53 crc kubenswrapper[5127]: E0201 07:12:53.845427 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-g7c4t_openstack(61a37fc0-b8b7-4bbc-ab43-2dae28037ee0)\"" pod="openstack/root-account-create-update-g7c4t" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.846330 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bb8a-account-create-update-xrpts" event={"ID":"a6a4a416-4347-4df8-80b1-edfa74abfe7e","Type":"ContainerDied","Data":"593cb714131d2679f5fa969f42633f7e745a972bc17536edae7e3acfde1f8fa0"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.846356 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593cb714131d2679f5fa969f42633f7e745a972bc17536edae7e3acfde1f8fa0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.888483 5127 generic.go:334] "Generic (PLEG): container finished" podID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerID="967655f08a7adad63c2db4ecc270313cb506f4dbb7d2de93e05145f31cc59387" exitCode=143 Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.888571 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerDied","Data":"967655f08a7adad63c2db4ecc270313cb506f4dbb7d2de93e05145f31cc59387"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.891623 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.903261 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:42814->10.217.0.209:8775: read: connection reset by peer" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.906721 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:42824->10.217.0.209:8775: read: connection reset by peer" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.907785 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.914904 5127 scope.go:117] "RemoveContainer" containerID="8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.916356 5127 generic.go:334] "Generic (PLEG): container finished" podID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerID="69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd" exitCode=0 Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.916502 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerDied","Data":"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.916526 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02abfc06-bde0-4894-a5f8-f07207f1ba28","Type":"ContainerDied","Data":"dfc203b3e02671f2206836f97c4531db07136693a3d09e3a94605756f4f47dcf"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.916724 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.970251 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"29d4b0b6-6bf1-466a-a0b5-dee3b16a533e","Type":"ContainerDied","Data":"9f440cf398021049af247233fa14d7fb089fd75c537c5c0c29d78cbac3be31f0"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.970335 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989503 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989567 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989675 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts\") pod \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989719 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989745 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 
07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989763 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989809 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck44x\" (UniqueName: \"kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x\") pod \"48898154-9be0-400f-8e0b-ef721132db71\" (UID: \"48898154-9be0-400f-8e0b-ef721132db71\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.989920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnnn5\" (UniqueName: \"kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5\") pod \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\" (UID: \"a6a4a416-4347-4df8-80b1-edfa74abfe7e\") " Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.990805 5127 generic.go:334] "Generic (PLEG): container finished" podID="cdafa63d-9b24-454c-a217-e53024719e75" containerID="c118e21ba6efeaa3a7ba640aedf062451fb2a67b8769dd72709cae85ff970c12" exitCode=0 Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.990886 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerDied","Data":"c118e21ba6efeaa3a7ba640aedf062451fb2a67b8769dd72709cae85ff970c12"} Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.996324 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:12:53 crc kubenswrapper[5127]: I0201 07:12:53.996440 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6a4a416-4347-4df8-80b1-edfa74abfe7e" (UID: "a6a4a416-4347-4df8-80b1-edfa74abfe7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:53.996544 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5" (OuterVolumeSpecName: "kube-api-access-hnnn5") pod "a6a4a416-4347-4df8-80b1-edfa74abfe7e" (UID: "a6a4a416-4347-4df8-80b1-edfa74abfe7e"). InnerVolumeSpecName "kube-api-access-hnnn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.001607 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts" (OuterVolumeSpecName: "scripts") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.004018 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" event={"ID":"38d5ee07-f2ba-4a01-abab-aa8a58056a1b","Type":"ContainerDied","Data":"a6fb044fc6a7651321acf6a9de2d7857d5c8415758dd2c063b3396dab494fcc5"} Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.004217 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f7d4bc459-g6tgf" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.012258 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.029649 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x" (OuterVolumeSpecName: "kube-api-access-ck44x") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). InnerVolumeSpecName "kube-api-access-ck44x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.055101 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" event={"ID":"7fbdc342-25af-4968-ad4c-5b294a488e39","Type":"ContainerDied","Data":"d3a0381066378f0d603d60e9b65a88af0f351afcf5930dba92e3f5dd8c69a4b4"} Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.055203 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4519-account-create-update-8tjgc" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.077924 5127 generic.go:334] "Generic (PLEG): container finished" podID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerID="bb0ddfad39e508e52c1255ed282f7f8d3226087a328247cb63a34d4e3ddca978" exitCode=0 Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.078002 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerDied","Data":"bb0ddfad39e508e52c1255ed282f7f8d3226087a328247cb63a34d4e3ddca978"} Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.094717 5127 generic.go:334] "Generic (PLEG): container finished" podID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerID="d23c240700660131c41788080e6a4a7bff561ffb7789deafac9dc5832ba94354" exitCode=143 Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.094763 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerDied","Data":"d23c240700660131c41788080e6a4a7bff561ffb7789deafac9dc5832ba94354"} Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099375 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnnn5\" (UniqueName: \"kubernetes.io/projected/a6a4a416-4347-4df8-80b1-edfa74abfe7e-kube-api-access-hnnn5\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099404 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099419 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099429 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6a4a416-4347-4df8-80b1-edfa74abfe7e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099439 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48898154-9be0-400f-8e0b-ef721132db71-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.099450 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck44x\" (UniqueName: \"kubernetes.io/projected/48898154-9be0-400f-8e0b-ef721132db71-kube-api-access-ck44x\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.145908 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.148784 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.186674 5127 scope.go:117] "RemoveContainer" containerID="f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.186775 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.187987 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1\": container with ID starting with f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1 not found: ID does not exist" containerID="f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.188039 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1"} err="failed to get container status \"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1\": rpc error: code = NotFound desc = could not find container \"f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1\": container with ID starting with f8ce083b30d376e8ef1cedee9a20ffc2580f448738c77a4226c780ef64a8c6b1 not found: ID does not exist" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.188057 5127 scope.go:117] "RemoveContainer" containerID="8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.190467 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.190517 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="ovn-northd" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.195137 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7\": container with ID starting with 8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7 not found: ID does not exist" containerID="8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.195191 
5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7"} err="failed to get container status \"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7\": rpc error: code = NotFound desc = could not find container \"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7\": container with ID starting with 8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7 not found: ID does not exist"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.195218 5127 scope.go:117] "RemoveContainer" containerID="4acd4b5b4ff519a2d04a0bd77806acea282f43ce9562e95499145235dc585912"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224181 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224230 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224301 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224366 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224381 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224430 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224448 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224463 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224492 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q48fw\" (UniqueName: \"kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw\") pod \"cdafa63d-9b24-454c-a217-e53024719e75\" (UID: \"cdafa63d-9b24-454c-a217-e53024719e75\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.224810 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.241625 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts" (OuterVolumeSpecName: "scripts") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.242179 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.242232 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4519-account-create-update-8tjgc"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.256694 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.279848 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.280777 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw" (OuterVolumeSpecName: "kube-api-access-q48fw") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "kube-api-access-q48fw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.284843 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs" (OuterVolumeSpecName: "logs") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.303903 5127 scope.go:117] "RemoveContainer" containerID="6163d1004ca352f7cf607b6d84cfea1f9bd6b47b3c1ec1ac1b077bbb590d2580"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.305809 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" path="/var/lib/kubelet/pods/36a8a4ac-b308-4bb8-be43-dddca18b1bc1/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.306798 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" path="/var/lib/kubelet/pods/41cdcc7d-28e4-4f12-a2b2-5052f7872c1a/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.307432 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0be460-5699-4787-9c9e-90df6400faed" path="/var/lib/kubelet/pods/4b0be460-5699-4787-9c9e-90df6400faed/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.308535 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fbdc342-25af-4968-ad4c-5b294a488e39" path="/var/lib/kubelet/pods/7fbdc342-25af-4968-ad4c-5b294a488e39/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.314946 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15e38c1-f8c8-4e6c-9e52-1b39e952017d" path="/var/lib/kubelet/pods/a15e38c1-f8c8-4e6c-9e52-1b39e952017d/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.318612 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" path="/var/lib/kubelet/pods/d523dcf2-c3fd-4473-ae9b-27e64a77205d/volumes"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325741 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325778 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lflpb\" (UniqueName: \"kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325799 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325881 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325903 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.325920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326006 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326026 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\" (UID: \"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d\") "
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326673 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cdafa63d-9b24-454c-a217-e53024719e75-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326686 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326694 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdafa63d-9b24-454c-a217-e53024719e75-logs\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326703 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q48fw\" (UniqueName: \"kubernetes.io/projected/cdafa63d-9b24-454c-a217-e53024719e75-kube-api-access-q48fw\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.326712 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.327662 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.328280 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.328513 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs" (OuterVolumeSpecName: "logs") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.344915 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.344953 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.344970 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.344981 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.344994 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345003 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7f7d4bc459-g6tgf"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345015 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345294 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-central-agent" containerID="cri-o://1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b" gracePeriod=30
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345706 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="proxy-httpd" containerID="cri-o://7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49" gracePeriod=30
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345762 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="sg-core" containerID="cri-o://c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357" gracePeriod=30
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.345793 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-notification-agent" containerID="cri-o://271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e" gracePeriod=30
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.354745 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb" (OuterVolumeSpecName: "kube-api-access-lflpb") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "kube-api-access-lflpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.358200 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts" (OuterVolumeSpecName: "scripts") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.364819 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.365562 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.369518 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.373364 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.379875 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerName="kube-state-metrics" containerID="cri-o://2c5095bb5c19bb3463f1baf571331379f62fe6f5cfacef8ce4ddbb8ec37e07f6" gracePeriod=30
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.412073 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.412767 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.206:3000/\": read tcp 10.217.0.2:37542->10.217.0.206:3000: read: connection reset by peer"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.412828 5127 scope.go:117] "RemoveContainer" containerID="69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd"
Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.427744 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429013 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429038 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429062 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429072 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429081 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429089 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429100 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lflpb\" (UniqueName: \"kubernetes.io/projected/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-kube-api-access-lflpb\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429109 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429117 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.429126 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.443992 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b2b-account-create-update-s4768"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.453421 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b2b-account-create-update-s4768"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.461151 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data" (OuterVolumeSpecName: "config-data") pod "cdafa63d-9b24-454c-a217-e53024719e75" (UID: "cdafa63d-9b24-454c-a217-e53024719e75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.467743 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.559662 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9b2b-account-create-update-bcddr"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560273 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560285 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560295 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="cinder-scheduler" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560301 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="cinder-scheduler" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560316 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560322 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560332 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-server" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560338 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-server" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560348 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="probe" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560354 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="probe" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560361 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a9fd1d-985f-497f-9b8e-773013dc8747" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560367 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9fd1d-985f-497f-9b8e-773013dc8747" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560375 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-log" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560380 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-log" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560392 5127 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="dnsmasq-dns" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560397 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="dnsmasq-dns" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560409 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="mysql-bootstrap" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560416 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="mysql-bootstrap" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560424 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560429 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560439 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="init" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560445 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="init" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560455 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="ovsdbserver-nb" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560461 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="ovsdbserver-nb" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560470 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560476 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560487 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="galera" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560493 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="galera" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560501 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="ovsdbserver-sb" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560506 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="ovsdbserver-sb" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560515 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560520 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560532 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560538 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560549 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560554 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.560566 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api-log" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560572 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api-log" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560746 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="ovsdbserver-nb" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560764 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560772 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" containerName="galera" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560783 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="cinder-scheduler" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560792 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a9fd1d-985f-497f-9b8e-773013dc8747" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560804 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="48898154-9be0-400f-8e0b-ef721132db71" containerName="probe" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560816 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-log" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560824 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" containerName="proxy-server" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560839 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" containerName="glance-httpd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560847 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="ovsdbserver-sb" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560858 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560867 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d523dcf2-c3fd-4473-ae9b-27e64a77205d" containerName="openstack-network-exporter" Feb 01 07:12:54 
crc kubenswrapper[5127]: I0201 07:12:54.560874 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="36a8a4ac-b308-4bb8-be43-dddca18b1bc1" containerName="openstack-network-exporter" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560881 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560890 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0be460-5699-4787-9c9e-90df6400faed" containerName="ovn-controller" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560899 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="41cdcc7d-28e4-4f12-a2b2-5052f7872c1a" containerName="dnsmasq-dns" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.560910 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdafa63d-9b24-454c-a217-e53024719e75" containerName="cinder-api-log" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.561454 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.567404 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.584090 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.190:8081/readyz\": dial tcp 10.217.0.190:8081: connect: connection refused" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.598043 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9b2b-account-create-update-bcddr"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.599069 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdafa63d-9b24-454c-a217-e53024719e75-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.599098 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.627710 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.631572 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-v4dp5"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.639749 5127 scope.go:117] "RemoveContainer" containerID="f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.678691 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-z8dx7"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.686811 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data" (OuterVolumeSpecName: "config-data") pod "48898154-9be0-400f-8e0b-ef721132db71" (UID: "48898154-9be0-400f-8e0b-ef721132db71"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.695684 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-v4dp5"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.701452 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.701486 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprn8\" (UniqueName: \"kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.701557 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.701567 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48898154-9be0-400f-8e0b-ef721132db71-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.701628 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-z8dx7"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.703573 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data" (OuterVolumeSpecName: "config-data") pod "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" (UID: "a9187249-9aa3-4b9e-a7db-47d95e5c4f6d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.706339 5127 scope.go:117] "RemoveContainer" containerID="69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.706905 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd\": container with ID starting with 69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd not found: ID does not exist" containerID="69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.706948 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd"} err="failed to get container status \"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd\": rpc error: code = NotFound desc = could not find container \"69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd\": container with ID starting with 69d5fe1dc5625b50c807a966718267132b6c59c9301253e4753aa9f34e2b08bd not found: ID does not exist" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.707008 5127 scope.go:117] "RemoveContainer" containerID="f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.707506 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607\": container with ID starting with f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607 not found: ID does not exist" containerID="f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.707534 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607"} err="failed to get container status \"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607\": rpc error: code = NotFound desc = could not find container \"f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607\": container with ID starting with f6105d5a06ec8adbddf55acbf24e199ec9994524ed89b4f357812a000810d607 not found: ID does not exist" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.707553 5127 scope.go:117] "RemoveContainer" containerID="59ccaa8d2bba84519b6c4dd0057f50e45fb9b19f2e881b045bcfc6bbe203275b" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.707506 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.771559 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.772525 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-5fdd8b75cb-lhmbf" podUID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" containerName="keystone-api" containerID="cri-o://40b690d53e4e14c7eb51d61afb6d0b0437739a3d5946e3586f5c4d6026b0819a" gracePeriod=30 Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.800686 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-f4qps"] Feb 01 07:12:54 
crc kubenswrapper[5127]: I0201 07:12:54.802918 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.802948 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprn8\" (UniqueName: \"kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.803020 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.803068 5127 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.803124 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:55.303105393 +0000 UTC m=+1525.789007756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : configmap "openstack-scripts" not found Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.805906 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b2b-account-create-update-bcddr"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.806482 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cprn8 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-9b2b-account-create-update-bcddr" podUID="d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.809105 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cprn8 for pod openstack/keystone-9b2b-account-create-update-bcddr: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.809167 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8 podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:55.309139426 +0000 UTC m=+1525.795041789 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cprn8" (UniqueName: "kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.815019 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.831048 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-f4qps"] Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.839634 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.857025 5127 scope.go:117] "RemoveContainer" containerID="8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.857874 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7"} err="failed to get container status \"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7\": rpc error: code = NotFound desc = could not find container \"8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7\": container with ID starting with 8563ff41d434131affb4b553163c2e22fac6b19fe237724934f0acfc162953a7 not found: ID does not exist" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.868233 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g7c4t"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.941675 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.942236 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.942680 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.942707 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.944567 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.946334 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.947216 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:12:54 crc kubenswrapper[5127]: E0201 07:12:54.947246 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:12:54 crc kubenswrapper[5127]: I0201 07:12:54.978895 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005064 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data\") pod \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005338 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle\") pod \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005378 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hkfb\" (UniqueName: \"kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb\") pod \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005470 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005494 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005517 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs\") pod \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005594 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbzch\" (UniqueName: \"kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005614 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs\") pod \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\" (UID: \"aed0e157-f34a-4343-ae3b-71e045eb4cf4\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005674 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005711 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" 
(UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005731 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.005759 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data\") pod \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\" (UID: \"79f921c6-ec0a-46f5-b3c3-5d479690d0e5\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.007471 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs" (OuterVolumeSpecName: "logs") pod "aed0e157-f34a-4343-ae3b-71e045eb4cf4" (UID: "aed0e157-f34a-4343-ae3b-71e045eb4cf4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.009702 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs" (OuterVolumeSpecName: "logs") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.015770 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch" (OuterVolumeSpecName: "kube-api-access-lbzch") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "kube-api-access-lbzch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.020962 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb" (OuterVolumeSpecName: "kube-api-access-6hkfb") pod "aed0e157-f34a-4343-ae3b-71e045eb4cf4" (UID: "aed0e157-f34a-4343-ae3b-71e045eb4cf4"). InnerVolumeSpecName "kube-api-access-6hkfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.026537 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="galera" containerID="cri-o://eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" gracePeriod=30 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.032284 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts" (OuterVolumeSpecName: "scripts") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.099174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data" (OuterVolumeSpecName: "config-data") pod "aed0e157-f34a-4343-ae3b-71e045eb4cf4" (UID: "aed0e157-f34a-4343-ae3b-71e045eb4cf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110488 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110672 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110786 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110900 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.110982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn288\" (UniqueName: \"kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111122 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111199 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data\") pod \"4d6754e0-125e-446b-8ef2-fc58883f6c76\" (UID: \"4d6754e0-125e-446b-8ef2-fc58883f6c76\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111632 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 
07:12:55.111695 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hkfb\" (UniqueName: \"kubernetes.io/projected/aed0e157-f34a-4343-ae3b-71e045eb4cf4-kube-api-access-6hkfb\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111756 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111809 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aed0e157-f34a-4343-ae3b-71e045eb4cf4-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111858 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbzch\" (UniqueName: \"kubernetes.io/projected/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-kube-api-access-lbzch\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.111908 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.117381 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data" (OuterVolumeSpecName: "config-data") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.117969 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs" (OuterVolumeSpecName: "logs") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.118777 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125559 5127 generic.go:334] "Generic (PLEG): container finished" podID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerID="7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125599 5127 generic.go:334] "Generic (PLEG): container finished" podID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerID="c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357" exitCode=2 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125608 5127 generic.go:334] "Generic (PLEG): container finished" podID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerID="1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125646 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerDied","Data":"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125671 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerDied","Data":"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.125682 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerDied","Data":"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.128052 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerID="fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.128098 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerDied","Data":"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.128119 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4d6754e0-125e-446b-8ef2-fc58883f6c76","Type":"ContainerDied","Data":"198654e0dfa52a31b4265533d3dddf843d18338e387b896924c598a1c48f6e8e"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.128139 5127 scope.go:117] "RemoveContainer" containerID="fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.128236 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.141275 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.145912 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts" (OuterVolumeSpecName: "scripts") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.158699 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aed0e157-f34a-4343-ae3b-71e045eb4cf4" (UID: "aed0e157-f34a-4343-ae3b-71e045eb4cf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.170470 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288" (OuterVolumeSpecName: "kube-api-access-sn288") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "kube-api-access-sn288". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.171366 5127 generic.go:334] "Generic (PLEG): container finished" podID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerID="bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.172217 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6fbd756774-8bz24" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.172250 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerDied","Data":"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.182196 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6fbd756774-8bz24" event={"ID":"79f921c6-ec0a-46f5-b3c3-5d479690d0e5","Type":"ContainerDied","Data":"a7789c4cd775313f628a5126de285bf080b0610efdda813fd7d4bd315bc73b60"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.208023 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.208559 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "aed0e157-f34a-4343-ae3b-71e045eb4cf4" (UID: "aed0e157-f34a-4343-ae3b-71e045eb4cf4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214279 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214312 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214326 5127 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214335 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214344 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn288\" (UniqueName: \"kubernetes.io/projected/4d6754e0-125e-446b-8ef2-fc58883f6c76-kube-api-access-sn288\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214358 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d6754e0-125e-446b-8ef2-fc58883f6c76-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214367 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214374 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aed0e157-f34a-4343-ae3b-71e045eb4cf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.214394 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.215386 5127 generic.go:334] "Generic (PLEG): container finished" podID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerID="29d8d027dbe06246751c1b56e85016b77f2dd4ca87ded166e55fa2c4832c64ec" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.215443 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerDied","Data":"29d8d027dbe06246751c1b56e85016b77f2dd4ca87ded166e55fa2c4832c64ec"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.220058 5127 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-g7c4t" secret="" err="secret \"galera-openstack-dockercfg-ccz6b\" not found" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.220094 5127 scope.go:117] "RemoveContainer" containerID="c359e7f7badca9f1e344c4f8fa4ef9cdc47db3820c690b533ab6b480f1e9cced" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.220436 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-g7c4t_openstack(61a37fc0-b8b7-4bbc-ab43-2dae28037ee0)\"" pod="openstack/root-account-create-update-g7c4t" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.218086 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data" (OuterVolumeSpecName: "config-data") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.237957 5127 generic.go:334] "Generic (PLEG): container finished" podID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerID="c29366b00ecfb7dffff5a9a80692040e245c0a01c1fbbaf5d4d101f0738c006c" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.238152 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerDied","Data":"c29366b00ecfb7dffff5a9a80692040e245c0a01c1fbbaf5d4d101f0738c006c"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.257187 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a9187249-9aa3-4b9e-a7db-47d95e5c4f6d","Type":"ContainerDied","Data":"c3bab97ff69b9f5dfb89d459d2cda1d227b4e8b94b7a20cce9ba0a3bb9e3b52d"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.257252 5127 scope.go:117] "RemoveContainer" containerID="9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.257382 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.341625 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.341681 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprn8\" (UniqueName: \"kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.341807 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.344977 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cprn8 for pod openstack/keystone-9b2b-account-create-update-bcddr: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.345248 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.349786 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.349977 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8 podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:56.3499505 +0000 UTC m=+1526.835852863 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cprn8" (UniqueName: "kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.350292 5127 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.350333 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts podName:61a37fc0-b8b7-4bbc-ab43-2dae28037ee0 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:55.85032297 +0000 UTC m=+1526.336225323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts") pod "root-account-create-update-g7c4t" (UID: "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0") : configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.350764 5127 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.350800 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:56.350792983 +0000 UTC m=+1526.836695346 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.353857 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.354353 5127 generic.go:334] "Generic (PLEG): container finished" podID="472be6e7-d046-4377-b055-50828b00b8cd" containerID="62c7e7aeed632c501e98dba48dbb0ca73647880b2adcc2c89f331f126d30002a" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.354419 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerDied","Data":"62c7e7aeed632c501e98dba48dbb0ca73647880b2adcc2c89f331f126d30002a"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.367832 5127 generic.go:334] "Generic (PLEG): container finished" podID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerID="2c5095bb5c19bb3463f1baf571331379f62fe6f5cfacef8ce4ddbb8ec37e07f6" exitCode=2 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.367923 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd0a3f5a-2119-403c-8b4c-e452465a71e8","Type":"ContainerDied","Data":"2c5095bb5c19bb3463f1baf571331379f62fe6f5cfacef8ce4ddbb8ec37e07f6"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.372246 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cdafa63d-9b24-454c-a217-e53024719e75","Type":"ContainerDied","Data":"58b78ac0e023f1b9d6bbc86bfb115def8590217eba11dbc9a8f50c2c2e5076d3"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.372383 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.447162 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.447187 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.447196 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.471810 5127 scope.go:117] "RemoveContainer" containerID="fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.472056 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "79f921c6-ec0a-46f5-b3c3-5d479690d0e5" (UID: "79f921c6-ec0a-46f5-b3c3-5d479690d0e5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.473837 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183\": container with ID starting with fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183 not found: ID does not exist" containerID="fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.473878 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183"} err="failed to get container status \"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183\": rpc error: code = NotFound desc = could not find container \"fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183\": container with ID starting with fe9175b3ecbc5cd866a2ac2efa9f7548c756f3536c64fc0905e1884fd8d0e183 not found: ID does not exist" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.473899 5127 scope.go:117] "RemoveContainer" containerID="9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.474736 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf\": container with ID starting with 9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf not found: ID does not exist" containerID="9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.474761 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf"} err="failed to get container status \"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf\": rpc error: code = NotFound desc = could not find 
container \"9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf\": container with ID starting with 9d0ee4988dac854e81309aaad18721dac4d3cdf3cbc0f87813de12d3bb4266bf not found: ID does not exist" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.474776 5127 scope.go:117] "RemoveContainer" containerID="bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.475418 5127 generic.go:334] "Generic (PLEG): container finished" podID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerID="f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277" exitCode=0 Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.475540 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.475946 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerDied","Data":"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.475989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aed0e157-f34a-4343-ae3b-71e045eb4cf4","Type":"ContainerDied","Data":"053e414d70c59a0901ee3b9b4b7930d0344dd5350a87e82274d651eb5343cab4"} Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.476087 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.476845 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.477008 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bb8a-account-create-update-xrpts" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.526565 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.541084 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4d6754e0-125e-446b-8ef2-fc58883f6c76" (UID: "4d6754e0-125e-446b-8ef2-fc58883f6c76"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.562735 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6754e0-125e-446b-8ef2-fc58883f6c76-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.562766 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/79f921c6-ec0a-46f5-b3c3-5d479690d0e5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.562874 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.563923 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data podName:824fc658-1c02-4470-9ed3-e4123ddd7575 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:59.562906109 +0000 UTC m=+1530.048808462 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data") pod "rabbitmq-cell1-server-0" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575") : configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.568395 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.593546 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.628639 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.647178 5127 scope.go:117] "RemoveContainer" containerID="280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.664143 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config\") pod \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.664295 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs\") pod \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.664359 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjsbn\" (UniqueName: \"kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn\") pod \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.664393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle\") pod 
\"fd0a3f5a-2119-403c-8b4c-e452465a71e8\" (UID: \"fd0a3f5a-2119-403c-8b4c-e452465a71e8\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.672272 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn" (OuterVolumeSpecName: "kube-api-access-pjsbn") pod "fd0a3f5a-2119-403c-8b4c-e452465a71e8" (UID: "fd0a3f5a-2119-403c-8b4c-e452465a71e8"). InnerVolumeSpecName "kube-api-access-pjsbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.691259 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd0a3f5a-2119-403c-8b4c-e452465a71e8" (UID: "fd0a3f5a-2119-403c-8b4c-e452465a71e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.708693 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "fd0a3f5a-2119-403c-8b4c-e452465a71e8" (UID: "fd0a3f5a-2119-403c-8b4c-e452465a71e8"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.743734 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "fd0a3f5a-2119-403c-8b4c-e452465a71e8" (UID: "fd0a3f5a-2119-403c-8b4c-e452465a71e8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.746909 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.766112 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.767694 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.767867 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.768045 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.770993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbmdv\" (UniqueName: \"kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.771148 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.771196 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs" (OuterVolumeSpecName: "logs") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.771300 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs\") pod \"472be6e7-d046-4377-b055-50828b00b8cd\" (UID: \"472be6e7-d046-4377-b055-50828b00b8cd\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.771926 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/472be6e7-d046-4377-b055-50828b00b8cd-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.773157 5127 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.773232 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjsbn\" (UniqueName: \"kubernetes.io/projected/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-api-access-pjsbn\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.773284 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.773387 5127 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fd0a3f5a-2119-403c-8b4c-e452465a71e8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.779497 5127 scope.go:117] "RemoveContainer" containerID="bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.781339 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5\": container with ID starting with bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5 not found: ID does not exist" containerID="bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.781386 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5"} err="failed to get container status \"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5\": rpc error: code = NotFound desc = could not find container \"bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5\": container with ID starting with bac6fc09863bea01efc2f02604b6e628fbea0a03fba9a16f237fce8630ffa5c5 not found: ID does not exist" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.781414 5127 scope.go:117] "RemoveContainer" containerID="280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.782224 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f\": container with ID starting with 
280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f not found: ID does not exist" containerID="280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.782255 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f"} err="failed to get container status \"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f\": rpc error: code = NotFound desc = could not find container \"280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f\": container with ID starting with 280a0820967478ee49cd4b2396972514462bde8c5093ad41e64e748f7af1aa8f not found: ID does not exist" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.782276 5127 scope.go:117] "RemoveContainer" containerID="bb0ddfad39e508e52c1255ed282f7f8d3226087a328247cb63a34d4e3ddca978" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.782356 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.796691 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.797606 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.802682 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv" (OuterVolumeSpecName: "kube-api-access-zbmdv") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "kube-api-access-zbmdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.803931 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.827019 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.837201 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.839892 5127 scope.go:117] "RemoveContainer" containerID="6c3858c14ef4c311b1deda9d45684f86e030100946c594b504545c60e4d6512d" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.848653 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data" (OuterVolumeSpecName: "config-data") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.848706 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.854075 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "472be6e7-d046-4377-b055-50828b00b8cd" (UID: "472be6e7-d046-4377-b055-50828b00b8cd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.866805 5127 scope.go:117] "RemoveContainer" containerID="c118e21ba6efeaa3a7ba640aedf062451fb2a67b8769dd72709cae85ff970c12" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874099 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs\") pod \"b8a6e525-1342-4031-8c3d-5920b8016c8e\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874144 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s68jl\" (UniqueName: \"kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl\") pod \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874167 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data\") pod \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874228 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qclqx\" (UniqueName: \"kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx\") pod \"b8a6e525-1342-4031-8c3d-5920b8016c8e\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874246 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle\") pod \"b8a6e525-1342-4031-8c3d-5920b8016c8e\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874263 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom\") pod \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\" (UID: 
\"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874287 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs\") pod \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874382 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data\") pod \"b8a6e525-1342-4031-8c3d-5920b8016c8e\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874455 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle\") pod \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\" (UID: \"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874477 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom\") pod \"b8a6e525-1342-4031-8c3d-5920b8016c8e\" (UID: \"b8a6e525-1342-4031-8c3d-5920b8016c8e\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.874568 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs" (OuterVolumeSpecName: "logs") pod "b8a6e525-1342-4031-8c3d-5920b8016c8e" (UID: "b8a6e525-1342-4031-8c3d-5920b8016c8e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.878923 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879227 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8a6e525-1342-4031-8c3d-5920b8016c8e-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879308 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879365 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbmdv\" (UniqueName: \"kubernetes.io/projected/472be6e7-d046-4377-b055-50828b00b8cd-kube-api-access-zbmdv\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879487 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879562 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.879642 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/472be6e7-d046-4377-b055-50828b00b8cd-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.879802 5127 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: E0201 07:12:55.879944 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts podName:61a37fc0-b8b7-4bbc-ab43-2dae28037ee0 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:56.879926892 +0000 UTC m=+1527.365829265 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts") pod "root-account-create-update-g7c4t" (UID: "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0") : configmap "openstack-scripts" not found Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.881718 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx" (OuterVolumeSpecName: "kube-api-access-qclqx") pod "b8a6e525-1342-4031-8c3d-5920b8016c8e" (UID: "b8a6e525-1342-4031-8c3d-5920b8016c8e"). InnerVolumeSpecName "kube-api-access-qclqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.885624 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bb8a-account-create-update-xrpts"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.885945 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs" (OuterVolumeSpecName: "logs") pod "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" (UID: "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.890369 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" (UID: "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.891248 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl" (OuterVolumeSpecName: "kube-api-access-s68jl") pod "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" (UID: "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a"). InnerVolumeSpecName "kube-api-access-s68jl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.893873 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.900823 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.903403 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b8a6e525-1342-4031-8c3d-5920b8016c8e" (UID: "b8a6e525-1342-4031-8c3d-5920b8016c8e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.904106 5127 scope.go:117] "RemoveContainer" containerID="3ef4cc3b13aaf195c0e9ab17d2d878bd41f2a1d4e67c807b8411510d47ddce71" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.910909 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.926680 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.934864 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.941478 5127 scope.go:117] "RemoveContainer" containerID="f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.945480 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" (UID: "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.956765 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data" (OuterVolumeSpecName: "config-data") pod "b8a6e525-1342-4031-8c3d-5920b8016c8e" (UID: "b8a6e525-1342-4031-8c3d-5920b8016c8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.960633 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.968377 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data" (OuterVolumeSpecName: "config-data") pod "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" (UID: "f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.971844 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8a6e525-1342-4031-8c3d-5920b8016c8e" (UID: "b8a6e525-1342-4031-8c3d-5920b8016c8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.980526 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data\") pod \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.980724 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle\") pod \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.980805 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tcff\" (UniqueName: \"kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff\") pod \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\" (UID: \"644a363d-bd2b-4cb5-81bf-05f7514d7abe\") " Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981172 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s68jl\" (UniqueName: \"kubernetes.io/projected/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-kube-api-access-s68jl\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981192 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981205 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qclqx\" (UniqueName: \"kubernetes.io/projected/b8a6e525-1342-4031-8c3d-5920b8016c8e-kube-api-access-qclqx\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981218 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981259 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981271 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981281 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981292 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.981301 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b8a6e525-1342-4031-8c3d-5920b8016c8e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.984941 5127 scope.go:117] "RemoveContainer" containerID="fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75" Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.986683 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.992498 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:12:55 crc kubenswrapper[5127]: I0201 07:12:55.994724 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff" (OuterVolumeSpecName: "kube-api-access-7tcff") pod "644a363d-bd2b-4cb5-81bf-05f7514d7abe" (UID: "644a363d-bd2b-4cb5-81bf-05f7514d7abe"). InnerVolumeSpecName "kube-api-access-7tcff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.002319 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "644a363d-bd2b-4cb5-81bf-05f7514d7abe" (UID: "644a363d-bd2b-4cb5-81bf-05f7514d7abe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.010703 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6fbd756774-8bz24"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.013343 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data" (OuterVolumeSpecName: "config-data") pod "644a363d-bd2b-4cb5-81bf-05f7514d7abe" (UID: "644a363d-bd2b-4cb5-81bf-05f7514d7abe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.014266 5127 scope.go:117] "RemoveContainer" containerID="f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.014786 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277\": container with ID starting with f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277 not found: ID does not exist" containerID="f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.014827 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277"} err="failed to get container status \"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277\": rpc error: code = NotFound desc = could not find container \"f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277\": container with ID starting with f50d10327e9375dab741c8775f9dbdc86edc7958b00c7eda540155099b262277 not found: ID does not exist" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.014855 5127 scope.go:117] "RemoveContainer" containerID="fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.016019 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75\": container with ID starting with fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75 not found: ID does not exist" containerID="fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.016060 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75"} err="failed to get container status \"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75\": rpc error: code = NotFound desc = could not find container \"fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75\": container with ID starting with fe9de4b95e0d4d602a54f3da862c292f529c42bc96c17d93efc1a4676f2fae75 not found: ID does not exist" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.025752 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.032275 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.082911 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tcff\" (UniqueName: \"kubernetes.io/projected/644a363d-bd2b-4cb5-81bf-05f7514d7abe-kube-api-access-7tcff\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.082952 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.082967 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/644a363d-bd2b-4cb5-81bf-05f7514d7abe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.094821 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184297 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184500 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nf8m\" (UniqueName: \"kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184537 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184578 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184662 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.184684 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs\") pod \"f85085ef-a23e-41f4-8839-08915aaaef7e\" (UID: \"f85085ef-a23e-41f4-8839-08915aaaef7e\") " Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.187856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs" (OuterVolumeSpecName: "logs") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.189295 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m" (OuterVolumeSpecName: "kube-api-access-9nf8m") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "kube-api-access-9nf8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.205921 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.222484 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data" (OuterVolumeSpecName: "config-data") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.227260 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.253218 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f85085ef-a23e-41f4-8839-08915aaaef7e" (UID: "f85085ef-a23e-41f4-8839-08915aaaef7e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.262104 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02abfc06-bde0-4894-a5f8-f07207f1ba28" path="/var/lib/kubelet/pods/02abfc06-bde0-4894-a5f8-f07207f1ba28/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.267690 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d4b0b6-6bf1-466a-a0b5-dee3b16a533e" path="/var/lib/kubelet/pods/29d4b0b6-6bf1-466a-a0b5-dee3b16a533e/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.275927 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d5ee07-f2ba-4a01-abab-aa8a58056a1b" path="/var/lib/kubelet/pods/38d5ee07-f2ba-4a01-abab-aa8a58056a1b/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.277168 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48898154-9be0-400f-8e0b-ef721132db71" path="/var/lib/kubelet/pods/48898154-9be0-400f-8e0b-ef721132db71/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.279593 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" path="/var/lib/kubelet/pods/4d6754e0-125e-446b-8ef2-fc58883f6c76/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.280786 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" path="/var/lib/kubelet/pods/79f921c6-ec0a-46f5-b3c3-5d479690d0e5/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.281359 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9f1eb0-ad17-4bd0-b554-bff78a522559" 
path="/var/lib/kubelet/pods/8b9f1eb0-ad17-4bd0-b554-bff78a522559/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286567 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nf8m\" (UniqueName: \"kubernetes.io/projected/f85085ef-a23e-41f4-8839-08915aaaef7e-kube-api-access-9nf8m\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286622 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286636 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286650 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286661 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f85085ef-a23e-41f4-8839-08915aaaef7e-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.286673 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85085ef-a23e-41f4-8839-08915aaaef7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.289902 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6a4a416-4347-4df8-80b1-edfa74abfe7e" path="/var/lib/kubelet/pods/a6a4a416-4347-4df8-80b1-edfa74abfe7e/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.290428 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9187249-9aa3-4b9e-a7db-47d95e5c4f6d" path="/var/lib/kubelet/pods/a9187249-9aa3-4b9e-a7db-47d95e5c4f6d/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.291241 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" path="/var/lib/kubelet/pods/aed0e157-f34a-4343-ae3b-71e045eb4cf4/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.292556 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdafa63d-9b24-454c-a217-e53024719e75" path="/var/lib/kubelet/pods/cdafa63d-9b24-454c-a217-e53024719e75/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.293281 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf149136-6376-4e36-96c8-ed8680852c66" path="/var/lib/kubelet/pods/cf149136-6376-4e36-96c8-ed8680852c66/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.293757 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57f121c-42a8-4515-9b9f-f540a3a78b79" path="/var/lib/kubelet/pods/d57f121c-42a8-4515-9b9f-f540a3a78b79/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.295114 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5d487e-8c6a-431b-b720-4b242eec1c40" path="/var/lib/kubelet/pods/ee5d487e-8c6a-431b-b720-4b242eec1c40/volumes" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.388947 5127 configmap.go:193] Couldn't get configMap 
openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.389034 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:58.389015739 +0000 UTC m=+1528.874918122 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : configmap "openstack-scripts" not found Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.388574 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.389074 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprn8\" (UniqueName: \"kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8\") pod \"keystone-9b2b-account-create-update-bcddr\" (UID: \"d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50\") " pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.391872 5127 projected.go:194] Error preparing data for projected volume kube-api-access-cprn8 for pod openstack/keystone-9b2b-account-create-update-bcddr: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.392147 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8 podName:d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:58.392110353 +0000 UTC m=+1528.878012716 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cprn8" (UniqueName: "kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8") pod "keystone-9b2b-account-create-update-bcddr" (UID: "d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.498854 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" event={"ID":"f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a","Type":"ContainerDied","Data":"717aaaf2700890ab7416f23dfc7faa2c3ae9fd0caa185fb7450b34959c3fc613"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.498901 5127 scope.go:117] "RemoveContainer" containerID="29d8d027dbe06246751c1b56e85016b77f2dd4ca87ded166e55fa2c4832c64ec" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.498986 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7698d9bdb9-bwmxd" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.509704 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.509826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fd0a3f5a-2119-403c-8b4c-e452465a71e8","Type":"ContainerDied","Data":"6ec24b098801298a2c0a06dff7d2350137852098ef7c5015f145b8edf2995c50"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.520449 5127 generic.go:334] "Generic (PLEG): container finished" podID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" exitCode=0 Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.520545 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.520567 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"644a363d-bd2b-4cb5-81bf-05f7514d7abe","Type":"ContainerDied","Data":"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.521369 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"644a363d-bd2b-4cb5-81bf-05f7514d7abe","Type":"ContainerDied","Data":"930039a4d4d51370415637b7777b5091e70fe46a10eda86b6e553d1455075f78"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.528777 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6969499d9b-sjxsr" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.528968 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6969499d9b-sjxsr" event={"ID":"472be6e7-d046-4377-b055-50828b00b8cd","Type":"ContainerDied","Data":"f4c7d48d77b6440e94b4192e6678a4b60093900d8ce01ef00c84f59e13525610"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.534846 5127 generic.go:334] "Generic (PLEG): container finished" podID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerID="603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe" exitCode=0 Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.534914 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerDied","Data":"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.534941 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f85085ef-a23e-41f4-8839-08915aaaef7e","Type":"ContainerDied","Data":"b52b7fce573e5b869172d9ae9f26a0cacbe7fa672a45539ebf3f37972819a8d7"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.535006 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.542001 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" event={"ID":"b8a6e525-1342-4031-8c3d-5920b8016c8e","Type":"ContainerDied","Data":"c43bdf483ccdb71e73a52ef433ef6721b305b568dabd90800a9967a4bf4cc820"} Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.542084 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6b5bcb8846-2gxlg" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.542272 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.546924 5127 scope.go:117] "RemoveContainer" containerID="a3a77f3f69d363acbcf4efc5d0f20f16e293179511ce86f0cdd3c1b58066afa5" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.554319 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7698d9bdb9-bwmxd"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.562047 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9b2b-account-create-update-bcddr" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.575483 5127 scope.go:117] "RemoveContainer" containerID="2c5095bb5c19bb3463f1baf571331379f62fe6f5cfacef8ce4ddbb8ec37e07f6" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.597723 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.606356 5127 scope.go:117] "RemoveContainer" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.613691 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.620821 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.630613 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.636697 5127 scope.go:117] "RemoveContainer" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.637562 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137\": container with ID starting with e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137 not found: ID does not exist" containerID="e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.637610 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137"} err="failed to get container status \"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137\": rpc error: code = NotFound desc = could not find container \"e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137\": container with ID starting with e1165c866c3c2c4184d897554d3d6d96861047059652b89c3bb869384a933137 not found: ID does not exist" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.637637 5127 scope.go:117] "RemoveContainer" containerID="62c7e7aeed632c501e98dba48dbb0ca73647880b2adcc2c89f331f126d30002a" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.641331 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.648508 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6969499d9b-sjxsr"] Feb 01 07:12:56 crc 
kubenswrapper[5127]: I0201 07:12:56.656793 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.676138 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6b5bcb8846-2gxlg"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.698021 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.700892 5127 scope.go:117] "RemoveContainer" containerID="829b8906c7d8a005a0f0715b5027bcf6b0f42ef7cc11158f5d59737c1d368916" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.715469 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.746420 5127 scope.go:117] "RemoveContainer" containerID="603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.775950 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9b2b-account-create-update-bcddr"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.795848 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9b2b-account-create-update-bcddr"] Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.804409 5127 scope.go:117] "RemoveContainer" containerID="f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.808215 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.808333 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:13:04.808278327 +0000 UTC m=+1535.294180690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.840167 5127 scope.go:117] "RemoveContainer" containerID="603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.844388 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe\": container with ID starting with 603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe not found: ID does not exist" containerID="603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.844437 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe"} err="failed to get container status \"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe\": rpc error: code = NotFound desc = could not find container \"603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe\": container with ID starting with 603efd7253b71ff502ae782e2384e67ade8046375ce086ab4ab1c85c14705ffe not found: ID does not exist" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.844466 5127 scope.go:117] "RemoveContainer" containerID="f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.846623 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a\": container with ID starting with f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a not found: ID does not exist" containerID="f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.846673 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a"} err="failed to get container status \"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a\": rpc error: code = NotFound desc = could not find container \"f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a\": container with ID starting with f472d063789cd118367e88ffe51ad0f3dabb6a31bdaa49fda614bffb90658e2a not found: ID does not exist" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.846697 5127 scope.go:117] "RemoveContainer" containerID="c29366b00ecfb7dffff5a9a80692040e245c0a01c1fbbaf5d4d101f0738c006c" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.878714 5127 scope.go:117] "RemoveContainer" containerID="a3ab6404657e8a50a3cc043680876f80e95b3982cb32682933e72885d036811f" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.884162 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 
07:12:56.885526 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.887259 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.887294 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="galera" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.911994 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: I0201 07:12:56.912021 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cprn8\" (UniqueName: \"kubernetes.io/projected/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50-kube-api-access-cprn8\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.912105 5127 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 01 07:12:56 crc kubenswrapper[5127]: E0201 07:12:56.912494 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts podName:61a37fc0-b8b7-4bbc-ab43-2dae28037ee0 nodeName:}" failed. No retries permitted until 2026-02-01 07:12:58.912473165 +0000 UTC m=+1529.398375528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts") pod "root-account-create-update-g7c4t" (UID: "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0") : configmap "openstack-scripts" not found Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.021285 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.115005 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5x42\" (UniqueName: \"kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42\") pod \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.115386 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts\") pod \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\" (UID: \"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.116290 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" (UID: "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.121281 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42" (OuterVolumeSpecName: "kube-api-access-s5x42") pod "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" (UID: "61a37fc0-b8b7-4bbc-ab43-2dae28037ee0"). InnerVolumeSpecName "kube-api-access-s5x42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.219615 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.219642 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5x42\" (UniqueName: \"kubernetes.io/projected/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0-kube-api-access-s5x42\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.420313 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527440 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527488 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527550 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5vf\" (UniqueName: \"kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527573 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527651 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527742 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527765 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.527813 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"011ed99a-688f-4874-b6f7-f861080ef9d5\" (UID: \"011ed99a-688f-4874-b6f7-f861080ef9d5\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.529139 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.529255 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.530141 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.530255 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.533555 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf" (OuterVolumeSpecName: "kube-api-access-pb5vf") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "kube-api-access-pb5vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.537344 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "mysql-db") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.556558 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.572721 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-g7c4t" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.572720 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-g7c4t" event={"ID":"61a37fc0-b8b7-4bbc-ab43-2dae28037ee0","Type":"ContainerDied","Data":"c7dd845f3b9d3aa1343a1d795c0aa15b072ab1c8aaf2aa2de019e03dcc1a6054"} Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.572896 5127 scope.go:117] "RemoveContainer" containerID="c359e7f7badca9f1e344c4f8fa4ef9cdc47db3820c690b533ab6b480f1e9cced" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.574809 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c50e0a2-f119-4a1a-911f-f7898cceddb8/ovn-northd/0.log" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.575052 5127 generic.go:334] "Generic (PLEG): container finished" podID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerID="0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" exitCode=139 Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.575189 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerDied","Data":"0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625"} Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.575220 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c50e0a2-f119-4a1a-911f-f7898cceddb8","Type":"ContainerDied","Data":"df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd"} Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.575234 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df9362d218c3814fe9e7cc536b2e53bb7d978a7e6c5187140d5d890e8d9c2acd" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.577024 5127 generic.go:334] "Generic (PLEG): container finished" podID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" exitCode=0 Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.577081 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerDied","Data":"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8"} Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.577146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"011ed99a-688f-4874-b6f7-f861080ef9d5","Type":"ContainerDied","Data":"3308442a8576384d0e401106f1cd9598832e0b35eafc0ea16cae645cbc9df338"} Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.577112 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.579412 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "011ed99a-688f-4874-b6f7-f861080ef9d5" (UID: "011ed99a-688f-4874-b6f7-f861080ef9d5"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.625356 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c50e0a2-f119-4a1a-911f-f7898cceddb8/ovn-northd/0.log" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.625424 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.626967 5127 scope.go:117] "RemoveContainer" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.630513 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb5vf\" (UniqueName: \"kubernetes.io/projected/011ed99a-688f-4874-b6f7-f861080ef9d5-kube-api-access-pb5vf\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.630538 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.630871 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.630895 5127 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.631038 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/011ed99a-688f-4874-b6f7-f861080ef9d5-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.631404 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.631549 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/011ed99a-688f-4874-b6f7-f861080ef9d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.631566 5127 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/011ed99a-688f-4874-b6f7-f861080ef9d5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.656548 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-g7c4t"] Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.657699 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.668170 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-g7c4t"] Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.675747 5127 scope.go:117] "RemoveContainer" containerID="2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4" Feb 01 07:12:57 crc 
kubenswrapper[5127]: I0201 07:12:57.694624 5127 scope.go:117] "RemoveContainer" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" Feb 01 07:12:57 crc kubenswrapper[5127]: E0201 07:12:57.695041 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8\": container with ID starting with eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8 not found: ID does not exist" containerID="eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.695083 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8"} err="failed to get container status \"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8\": rpc error: code = NotFound desc = could not find container \"eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8\": container with ID starting with eb1225995542648030392459f107309d97df893b10b29f219fe62a0dd67ef7d8 not found: ID does not exist" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.695112 5127 scope.go:117] "RemoveContainer" containerID="2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4" Feb 01 07:12:57 crc kubenswrapper[5127]: E0201 07:12:57.695422 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4\": container with ID starting with 2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4 not found: ID does not exist" containerID="2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.695453 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4"} err="failed to get container status \"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4\": rpc error: code = NotFound desc = could not find container \"2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4\": container with ID starting with 2ecf7d9c07e3b144c3922c4392a5d6c0fa5a58d2e892eb3384ea4d9c01649af4 not found: ID does not exist" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732143 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732243 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732279 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gchqn\" (UniqueName: \"kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 
crc kubenswrapper[5127]: I0201 07:12:57.732306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732336 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732374 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732467 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs\") pod \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\" (UID: \"6c50e0a2-f119-4a1a-911f-f7898cceddb8\") " Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.732785 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.734378 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config" (OuterVolumeSpecName: "config") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.734409 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts" (OuterVolumeSpecName: "scripts") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.734614 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.737093 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn" (OuterVolumeSpecName: "kube-api-access-gchqn") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "kube-api-access-gchqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.754452 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.785951 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.790885 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6c50e0a2-f119-4a1a-911f-f7898cceddb8" (UID: "6c50e0a2-f119-4a1a-911f-f7898cceddb8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.837871 5127 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838050 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gchqn\" (UniqueName: \"kubernetes.io/projected/6c50e0a2-f119-4a1a-911f-f7898cceddb8-kube-api-access-gchqn\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838135 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838190 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838253 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c50e0a2-f119-4a1a-911f-f7898cceddb8-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838305 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.838355 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c50e0a2-f119-4a1a-911f-f7898cceddb8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.916224 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:12:57 crc kubenswrapper[5127]: I0201 07:12:57.930067 5127 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.246270 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" path="/var/lib/kubelet/pods/011ed99a-688f-4874-b6f7-f861080ef9d5/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.247106 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="472be6e7-d046-4377-b055-50828b00b8cd" path="/var/lib/kubelet/pods/472be6e7-d046-4377-b055-50828b00b8cd/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.247775 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" path="/var/lib/kubelet/pods/61a37fc0-b8b7-4bbc-ab43-2dae28037ee0/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.248828 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" path="/var/lib/kubelet/pods/644a363d-bd2b-4cb5-81bf-05f7514d7abe/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.249532 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" path="/var/lib/kubelet/pods/b8a6e525-1342-4031-8c3d-5920b8016c8e/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.250264 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50" path="/var/lib/kubelet/pods/d8c6cc1b-1df7-4d1d-822b-c115f3f1ee50/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.251554 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" path="/var/lib/kubelet/pods/f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.252523 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" path="/var/lib/kubelet/pods/f85085ef-a23e-41f4-8839-08915aaaef7e/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.253322 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" path="/var/lib/kubelet/pods/fd0a3f5a-2119-403c-8b4c-e452465a71e8/volumes" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.609856 5127 generic.go:334] "Generic (PLEG): container finished" podID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" containerID="40b690d53e4e14c7eb51d61afb6d0b0437739a3d5946e3586f5c4d6026b0819a" exitCode=0 Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.609959 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdd8b75cb-lhmbf" event={"ID":"adddcef2-e42a-4f9c-a1c9-08b8253e7616","Type":"ContainerDied","Data":"40b690d53e4e14c7eb51d61afb6d0b0437739a3d5946e3586f5c4d6026b0819a"} Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.611754 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.639308 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.644611 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.806683 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5fdd8b75cb-lhmbf" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870112 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870168 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870226 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870342 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870371 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870434 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-849qx\" (UniqueName: \"kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.870453 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys\") pod \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\" (UID: \"adddcef2-e42a-4f9c-a1c9-08b8253e7616\") " Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.876956 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.877068 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.880133 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts" (OuterVolumeSpecName: "scripts") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.881715 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx" (OuterVolumeSpecName: "kube-api-access-849qx") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "kube-api-access-849qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.920773 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.927690 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data" (OuterVolumeSpecName: "config-data") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.950859 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.961733 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "adddcef2-e42a-4f9c-a1c9-08b8253e7616" (UID: "adddcef2-e42a-4f9c-a1c9-08b8253e7616"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973631 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-849qx\" (UniqueName: \"kubernetes.io/projected/adddcef2-e42a-4f9c-a1c9-08b8253e7616-kube-api-access-849qx\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973726 5127 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973736 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973749 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973757 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973767 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973775 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:58 crc kubenswrapper[5127]: I0201 07:12:58.973798 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adddcef2-e42a-4f9c-a1c9-08b8253e7616-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.208557 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.278951 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279027 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279097 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zx4g\" (UniqueName: \"kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279142 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279178 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279228 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279293 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.279348 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts\") pod \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\" (UID: \"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.282076 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.282374 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.282746 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts" (OuterVolumeSpecName: "scripts") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.286047 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g" (OuterVolumeSpecName: "kube-api-access-6zx4g") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "kube-api-access-6zx4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.304982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.321924 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.374831 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381837 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381869 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381882 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zx4g\" (UniqueName: \"kubernetes.io/projected/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-kube-api-access-6zx4g\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381892 5127 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381901 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381909 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.381918 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.394643 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data" (OuterVolumeSpecName: "config-data") pod "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" (UID: "d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.483053 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.584989 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.585322 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data podName:824fc658-1c02-4470-9ed3-e4123ddd7575 nodeName:}" failed. No retries permitted until 2026-02-01 07:13:07.585307645 +0000 UTC m=+1538.071210008 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data") pod "rabbitmq-cell1-server-0" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575") : configmap "rabbitmq-cell1-config-data" not found
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.598253 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.619508 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5fdd8b75cb-lhmbf" event={"ID":"adddcef2-e42a-4f9c-a1c9-08b8253e7616","Type":"ContainerDied","Data":"8abf53608175bed6672bf9502810cce90ca07b7bbb9ba0772682409192dff0d2"}
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.619565 5127 scope.go:117] "RemoveContainer" containerID="40b690d53e4e14c7eb51d61afb6d0b0437739a3d5946e3586f5c4d6026b0819a"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.619696 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5fdd8b75cb-lhmbf"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.626997 5127 generic.go:334] "Generic (PLEG): container finished" podID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerID="271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e" exitCode=0
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.627076 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerDied","Data":"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e"}
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.627105 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd","Type":"ContainerDied","Data":"796502bc0eec651a9fbd813406160e910689ed1dfcf89837fb447ba9e8a1eb13"}
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.627189 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.629176 5127 generic.go:334] "Generic (PLEG): container finished" podID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerID="9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395" exitCode=0
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.629207 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerDied","Data":"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395"}
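The two E-level entries above (configmap.go:193 and nestedpendingoperations.go:348) record one failure: the rabbitmq-cell1-config-data ConfigMap was deleted before its consumer pod, so MountVolume.SetUp for the config-data volume fails and the operation is re-queued with a growing delay, durationBeforeRetry 8s at this attempt. The delay shape is an exponential backoff; a minimal illustration of that growth (base, factor and cap here are assumptions chosen so the fifth retry lands on the 8s seen above, not constants taken from the kubelet source):

    # Retry delays double per failed attempt until a cap is reached.
    def backoff_delays(base=0.5, factor=2.0, cap=120.0, attempts=10):
        delay = base
        for _ in range(attempts):
            yield min(delay, cap)
            delay *= factor

    print(list(backoff_delays()))
    # -> [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 120.0, 120.0]

Since the pod itself is being torn down (the ContainerDied events around this failure), the retries stop when the pod is removed rather than ever succeeding.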
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.629229 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"824fc658-1c02-4470-9ed3-e4123ddd7575","Type":"ContainerDied","Data":"5ed72d99ca67c3ea63ade9c933fdf6ea0af43aed4fdd91e68c06aea804039233"} Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.657293 5127 scope.go:117] "RemoveContainer" containerID="7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.683409 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686293 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686537 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686714 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686803 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686897 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686944 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.686991 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.687062 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.687123 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.687169 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7djxw\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.687264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.687327 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data\") pod \"824fc658-1c02-4470-9ed3-e4123ddd7575\" (UID: \"824fc658-1c02-4470-9ed3-e4123ddd7575\") " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.688420 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: 
"824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.688784 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.690100 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.690142 5127 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.690159 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.690992 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.693661 5127 scope.go:117] "RemoveContainer" containerID="c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.701655 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5fdd8b75cb-lhmbf"] Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.705437 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.708547 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data" (OuterVolumeSpecName: "config-data") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.709937 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.711916 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info" (OuterVolumeSpecName: "pod-info") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.712268 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw" (OuterVolumeSpecName: "kube-api-access-7djxw") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "kube-api-access-7djxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.716120 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.716893 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.722230 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf" (OuterVolumeSpecName: "server-conf") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.759415 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "824fc658-1c02-4470-9ed3-e4123ddd7575" (UID: "824fc658-1c02-4470-9ed3-e4123ddd7575"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.765207 5127 scope.go:117] "RemoveContainer" containerID="271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.780929 5127 scope.go:117] "RemoveContainer" containerID="1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792112 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792146 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7djxw\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-kube-api-access-7djxw\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792156 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792166 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792175 5127 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/824fc658-1c02-4470-9ed3-e4123ddd7575-pod-info\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792184 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/824fc658-1c02-4470-9ed3-e4123ddd7575-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792193 5127 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/824fc658-1c02-4470-9ed3-e4123ddd7575-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.792202 5127 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/824fc658-1c02-4470-9ed3-e4123ddd7575-server-conf\") on node \"crc\" DevicePath \"\"" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.799475 5127 scope.go:117] "RemoveContainer" containerID="7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49" Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.799964 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49\": container with ID starting with 7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49 not found: ID does not exist" containerID="7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.799998 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49"} err="failed to get container status \"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49\": rpc error: code = NotFound desc 
= could not find container \"7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49\": container with ID starting with 7308e31ab5ef02a195c530f2b22302000a09aba39a4bb94a4f3f9aee93df5b49 not found: ID does not exist"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.800020 5127 scope.go:117] "RemoveContainer" containerID="c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357"
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.800376 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357\": container with ID starting with c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357 not found: ID does not exist" containerID="c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.800437 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357"} err="failed to get container status \"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357\": rpc error: code = NotFound desc = could not find container \"c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357\": container with ID starting with c24c0a2fbaea2952e3b293b22b33b93bebf306a98536f7ec1180a5dde07cc357 not found: ID does not exist"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.800462 5127 scope.go:117] "RemoveContainer" containerID="271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e"
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.800802 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e\": container with ID starting with 271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e not found: ID does not exist" containerID="271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.800832 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e"} err="failed to get container status \"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e\": rpc error: code = NotFound desc = could not find container \"271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e\": container with ID starting with 271011b18f0cfa97dbd9e6c06b7248bd122960affe6b5cfbc156d43e3561244e not found: ID does not exist"
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.800847 5127 scope.go:117] "RemoveContainer" containerID="1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b"
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.801142 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b\": container with ID starting with 1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b not found: ID does not exist" containerID="1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b"
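The repeating triple above, "RemoveContainer", then "ContainerStatus from runtime service failed" with code = NotFound, then "DeleteContainer returned error", is a benign race rather than a real failure: kubelet asks CRI-O about a container that an earlier pass already removed, so the status lookup can only answer NotFound. When triaging a dump like this, it helps to drop that noise and keep E-level lines with other causes; a minimal filter, assuming the exact wording seen here:

    import re

    # Keep E-level (error) lines except the expected post-removal
    # "could not find container" NotFound responses.
    NOTFOUND = re.compile(r'code = NotFound desc = could not find container')

    def real_errors(lines):
        for line in lines:
            if ' E0' in line and not NOTFOUND.search(line):
                yield line

    with open('kubelet.log') as f:
        for line in real_errors(f):
            print(line, end='')

Applied to this window, the filter keeps the ExecSync and probe errors further down but drops all of the NotFound ContainerStatus lines.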
containerID={"Type":"cri-o","ID":"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b"} err="failed to get container status \"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b\": rpc error: code = NotFound desc = could not find container \"1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b\": container with ID starting with 1146755e3371ef0b8bf91836979b500a703db4a442b1d3d0e7fdcf2e2ed3210b not found: ID does not exist" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.801181 5127 scope.go:117] "RemoveContainer" containerID="9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.808696 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.819067 5127 scope.go:117] "RemoveContainer" containerID="d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.838655 5127 scope.go:117] "RemoveContainer" containerID="9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395" Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.839092 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395\": container with ID starting with 9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395 not found: ID does not exist" containerID="9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.839144 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395"} err="failed to get container status \"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395\": rpc error: code = NotFound desc = could not find container \"9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395\": container with ID starting with 9071d3108d95acf363ef443bc8d1d94cbb73d092b9bdd3f52099765c41840395 not found: ID does not exist" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.839177 5127 scope.go:117] "RemoveContainer" containerID="d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028" Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.839524 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028\": container with ID starting with d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028 not found: ID does not exist" containerID="d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.839545 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028"} err="failed to get container status \"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028\": rpc error: code = NotFound desc = could not find container \"d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028\": container with ID starting with d9b67c4cdcdcb6c16ed635f631c71ab4d348974a98d4e0f977e716a1d08cf028 not found: ID does not exist" Feb 01 07:12:59 crc 
Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.894543 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.940946 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.941209 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.941634 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 01 07:12:59 crc kubenswrapper[5127]: E0201 07:12:59.941751 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server"
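The ExecSync failures above and below are readiness probes racing with container teardown: the ovsdb-server probe lands after the container process is already gone (code = NotFound), while the ovs-vswitchd probe that follows is rejected because CRI-O is mid-stop ("cannot register an exec PID: container is stopping"). Both are expected while a pod is being deleted. Grouping "Probe errored" events by pod and container shows whether they cluster on terminating pods; a minimal sketch under the same wording assumptions as the earlier snippets:

    import re
    from collections import Counter

    # Count probe failures per (pod, container); bursts confined to pods
    # that are being deleted are expected shutdown noise.
    PROBE = re.compile(r'"Probe errored".*pod="([^"]+)".*containerName="([^"]+)"')

    def probe_hotspots(lines):
        counts = Counter()
        for line in lines:
            m = PROBE.search(line)
            if m:
                counts[m.group(1), m.group(2)] += 1
        return counts.most_common()

    with open('kubelet.log') as f:
        print(probe_hotspots(f))

In this excerpt every hit belongs to a pod that has just received a SyncLoop DELETE, which matches that pattern.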
probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.981419 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:12:59 crc kubenswrapper[5127]: I0201 07:12:59.987821 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.246868 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" path="/var/lib/kubelet/pods/6c50e0a2-f119-4a1a-911f-f7898cceddb8/volumes" Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.248769 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" path="/var/lib/kubelet/pods/824fc658-1c02-4470-9ed3-e4123ddd7575/volumes" Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.249921 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" path="/var/lib/kubelet/pods/adddcef2-e42a-4f9c-a1c9-08b8253e7616/volumes" Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.251458 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" path="/var/lib/kubelet/pods/d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd/volumes" Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.660969 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.661211 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" containerName="memcached" containerID="cri-o://8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20" gracePeriod=30 Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.681646 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cgmxq"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.687009 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cgmxq"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.692022 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.692192 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" gracePeriod=30 Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.704697 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.704976 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerName="nova-cell1-conductor-conductor" containerID="cri-o://c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" gracePeriod=30 Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.705170 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"] Feb 01 07:13:00 crc kubenswrapper[5127]: 
Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.705170 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"]
Feb 01 07:13:00 crc kubenswrapper[5127]: I0201 07:13:00.709397 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2zcqd"]
Feb 01 07:13:00 crc kubenswrapper[5127]: E0201 07:13:00.810423 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:00 crc kubenswrapper[5127]: E0201 07:13:00.811803 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:00 crc kubenswrapper[5127]: E0201 07:13:00.812942 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:00 crc kubenswrapper[5127]: E0201 07:13:00.812968 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerName="nova-cell1-conductor-conductor"
Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.248180 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89710835-8f36-4539-90e7-7f442b5fd963" path="/var/lib/kubelet/pods/89710835-8f36-4539-90e7-7f442b5fd963/volumes"
Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.249291 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea46d9a-92b4-4bd6-b9b2-bf342ad7b350" path="/var/lib/kubelet/pods/aea46d9a-92b4-4bd6-b9b2-bf342ad7b350/volumes"
Feb 01 07:13:02 crc kubenswrapper[5127]: E0201 07:13:02.414299 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:02 crc kubenswrapper[5127]: E0201 07:13:02.417886 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:02 crc kubenswrapper[5127]: E0201 07:13:02.432770 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:02 crc kubenswrapper[5127]: E0201 07:13:02.432837 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID:
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.459309 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.491929 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle\") pod \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.491993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data\") pod \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.492081 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrswp\" (UniqueName: \"kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp\") pod \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.492107 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config\") pod \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.492176 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs\") pod \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\" (UID: \"d440b432-d2ce-4228-90b7-ad0c2e12ec86\") " Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.493640 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data" (OuterVolumeSpecName: "config-data") pod "d440b432-d2ce-4228-90b7-ad0c2e12ec86" (UID: "d440b432-d2ce-4228-90b7-ad0c2e12ec86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.494978 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d440b432-d2ce-4228-90b7-ad0c2e12ec86" (UID: "d440b432-d2ce-4228-90b7-ad0c2e12ec86"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.514152 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp" (OuterVolumeSpecName: "kube-api-access-rrswp") pod "d440b432-d2ce-4228-90b7-ad0c2e12ec86" (UID: "d440b432-d2ce-4228-90b7-ad0c2e12ec86"). InnerVolumeSpecName "kube-api-access-rrswp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.535116 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d440b432-d2ce-4228-90b7-ad0c2e12ec86" (UID: "d440b432-d2ce-4228-90b7-ad0c2e12ec86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.546534 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "d440b432-d2ce-4228-90b7-ad0c2e12ec86" (UID: "d440b432-d2ce-4228-90b7-ad0c2e12ec86"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.593859 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.593905 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.593916 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrswp\" (UniqueName: \"kubernetes.io/projected/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kube-api-access-rrswp\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.593927 5127 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d440b432-d2ce-4228-90b7-ad0c2e12ec86-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.593940 5127 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/d440b432-d2ce-4228-90b7-ad0c2e12ec86-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.660665 5127 generic.go:334] "Generic (PLEG): container finished" podID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" containerID="8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20" exitCode=0 Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.660699 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.660695 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d440b432-d2ce-4228-90b7-ad0c2e12ec86","Type":"ContainerDied","Data":"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20"} Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.660757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d440b432-d2ce-4228-90b7-ad0c2e12ec86","Type":"ContainerDied","Data":"41e902a000bb6082673625e78feb3483ce904397675767e01b6a4cc5c37379c8"} Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.660775 5127 scope.go:117] "RemoveContainer" containerID="8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.689890 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.698115 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.700783 5127 scope.go:117] "RemoveContainer" containerID="8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20" Feb 01 07:13:02 crc kubenswrapper[5127]: E0201 07:13:02.701498 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20\": container with ID starting with 8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20 not found: ID does not exist" containerID="8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20" Feb 01 07:13:02 crc kubenswrapper[5127]: I0201 07:13:02.701537 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20"} err="failed to get container status \"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20\": rpc error: code = NotFound desc = could not find container \"8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20\": container with ID starting with 8e7a5311bc3921094a62f4571fd153fadc7d703566e47fe791047ef652d13c20 not found: ID does not exist" Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.690888 5127 generic.go:334] "Generic (PLEG): container finished" podID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerID="c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" exitCode=0 Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.691305 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9df5c029-e707-4159-b8ec-2fb5dba38094","Type":"ContainerDied","Data":"c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601"} Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.897208 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.910834 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2nqv\" (UniqueName: \"kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv\") pod \"9df5c029-e707-4159-b8ec-2fb5dba38094\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.910958 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data\") pod \"9df5c029-e707-4159-b8ec-2fb5dba38094\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.910991 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle\") pod \"9df5c029-e707-4159-b8ec-2fb5dba38094\" (UID: \"9df5c029-e707-4159-b8ec-2fb5dba38094\") " Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.918899 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv" (OuterVolumeSpecName: "kube-api-access-x2nqv") pod "9df5c029-e707-4159-b8ec-2fb5dba38094" (UID: "9df5c029-e707-4159-b8ec-2fb5dba38094"). InnerVolumeSpecName "kube-api-access-x2nqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.951451 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9df5c029-e707-4159-b8ec-2fb5dba38094" (UID: "9df5c029-e707-4159-b8ec-2fb5dba38094"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:03 crc kubenswrapper[5127]: I0201 07:13:03.958533 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data" (OuterVolumeSpecName: "config-data") pod "9df5c029-e707-4159-b8ec-2fb5dba38094" (UID: "9df5c029-e707-4159-b8ec-2fb5dba38094"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.013520 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.013557 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9df5c029-e707-4159-b8ec-2fb5dba38094-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.013571 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2nqv\" (UniqueName: \"kubernetes.io/projected/9df5c029-e707-4159-b8ec-2fb5dba38094-kube-api-access-x2nqv\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.244816 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.247676 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" path="/var/lib/kubelet/pods/d440b432-d2ce-4228-90b7-ad0c2e12ec86/volumes" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.317874 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.317933 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvkw9\" (UniqueName: \"kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.317962 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.317978 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.317994 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.318056 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.318080 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config\") pod \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\" (UID: \"a63dd2b1-3f35-45bf-8e69-170e3e980eac\") " Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.340109 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9" (OuterVolumeSpecName: "kube-api-access-hvkw9") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "kube-api-access-hvkw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.350241 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.355228 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config" (OuterVolumeSpecName: "config") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.362055 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.363033 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.370933 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.398008 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a63dd2b1-3f35-45bf-8e69-170e3e980eac" (UID: "a63dd2b1-3f35-45bf-8e69-170e3e980eac"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420327 5127 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420362 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvkw9\" (UniqueName: \"kubernetes.io/projected/a63dd2b1-3f35-45bf-8e69-170e3e980eac-kube-api-access-hvkw9\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420372 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420381 5127 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420390 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420399 5127 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.420408 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a63dd2b1-3f35-45bf-8e69-170e3e980eac-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.706088 5127 generic.go:334] "Generic (PLEG): container finished" podID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerID="3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef" exitCode=0 Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.706200 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7bcb954fdc-q646r" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.707822 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerDied","Data":"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef"} Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.708265 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7bcb954fdc-q646r" event={"ID":"a63dd2b1-3f35-45bf-8e69-170e3e980eac","Type":"ContainerDied","Data":"de394e66fc791cb4226c4b103b88e2e418c115f640034c3a5719723870a2e0d7"} Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.708297 5127 scope.go:117] "RemoveContainer" containerID="1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.710177 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.712147 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9df5c029-e707-4159-b8ec-2fb5dba38094","Type":"ContainerDied","Data":"8c1c820ec5d7e58917ecb2d3c8c7355c70f4fcebf4b5aabdb47db8afcd051217"} Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.736914 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.748632 5127 scope.go:117] "RemoveContainer" containerID="3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.751247 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.758064 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7bcb954fdc-q646r"] Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.765946 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7bcb954fdc-q646r"] Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.782357 5127 scope.go:117] "RemoveContainer" containerID="1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae" Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.783212 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae\": container with ID starting with 1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae not found: ID does not exist" containerID="1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.783250 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae"} err="failed to get container status \"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae\": rpc error: code = NotFound desc = could not find container \"1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae\": container with ID starting with 1ba52f2f339f00973c555479b91c648e3fdd60e38765b9f237dc0b99e9a24cae not found: ID does not exist" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.783271 5127 scope.go:117] "RemoveContainer" containerID="3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef" Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.783565 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef\": container with ID starting with 3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef not found: ID does not exist" containerID="3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.783602 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef"} err="failed to get container status \"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef\": rpc error: code = NotFound desc = could not find container \"3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef\": container with ID 
starting with 3b5345aa18991cda807bd3f9382964110e22b805c740660f1cc55baf08a58fef not found: ID does not exist" Feb 01 07:13:04 crc kubenswrapper[5127]: I0201 07:13:04.783615 5127 scope.go:117] "RemoveContainer" containerID="c28c5890df4f7838cdc318f5b7ebfaec095f3417c57ddc9d7a804bfdda0a4601" Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.825360 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.825465 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:13:20.82544412 +0000 UTC m=+1551.311346513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.938999 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.939681 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.939847 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.941798 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.941871 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.943062 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.946468 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:04 crc kubenswrapper[5127]: E0201 07:13:04.946539 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:13:06 crc kubenswrapper[5127]: I0201 07:13:06.246028 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" path="/var/lib/kubelet/pods/9df5c029-e707-4159-b8ec-2fb5dba38094/volumes" Feb 01 07:13:06 crc kubenswrapper[5127]: I0201 07:13:06.247919 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" path="/var/lib/kubelet/pods/a63dd2b1-3f35-45bf-8e69-170e3e980eac/volumes" Feb 01 07:13:07 crc kubenswrapper[5127]: E0201 07:13:07.413617 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:07 crc kubenswrapper[5127]: E0201 07:13:07.415926 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:07 crc kubenswrapper[5127]: E0201 07:13:07.417541 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:07 crc kubenswrapper[5127]: E0201 07:13:07.417679 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.937853 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.938614 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.939250 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.939306 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.940097 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.942032 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.943996 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:09 crc kubenswrapper[5127]: E0201 07:13:09.944058 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:13:12 crc kubenswrapper[5127]: E0201 07:13:12.414174 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:12 crc kubenswrapper[5127]: E0201 07:13:12.417243 5127 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:12 crc kubenswrapper[5127]: E0201 07:13:12.419896 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:12 crc kubenswrapper[5127]: E0201 07:13:12.419990 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.938940 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.940007 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.940451 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.940632 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.940652 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.944838 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.946657 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:14 crc kubenswrapper[5127]: E0201 07:13:14.946736 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.672896 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"] Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673476 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-api" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673507 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-api" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673538 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="mysql-bootstrap" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673550 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="mysql-bootstrap" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673570 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="ovn-northd" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673650 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="ovn-northd" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673679 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-central-agent" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673692 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-central-agent" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673720 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker-log" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673735 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker-log" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673751 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673763 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener" Feb 01 
07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673785 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673797 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673821 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="galera" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673832 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="galera" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673853 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" containerName="memcached" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673865 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" containerName="memcached" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673891 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="proxy-httpd" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673903 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="proxy-httpd" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673923 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="setup-container" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673935 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="setup-container" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673960 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" containerName="keystone-api" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.673971 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" containerName="keystone-api" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.673989 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-httpd" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674000 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-httpd" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674019 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="sg-core" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674032 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="sg-core" Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674056 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="openstack-network-exporter" Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674067 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="openstack-network-exporter" 
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674087 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-httpd"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674099 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-httpd"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674113 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-notification-agent"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674125 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-notification-agent"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674140 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674152 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674177 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674189 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674209 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674221 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674241 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674252 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674268 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674280 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674303 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674315 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674338 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674349 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674366 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674378 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674396 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674408 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674426 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674439 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674455 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerName="nova-cell1-conductor-conductor"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674467 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerName="nova-cell1-conductor-conductor"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674498 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="rabbitmq"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674512 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="rabbitmq"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674527 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerName="kube-state-metrics"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674538 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerName="kube-state-metrics"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674561 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674574 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.674608 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerName="nova-scheduler-scheduler"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674620 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerName="nova-scheduler-scheduler"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674893 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-central-agent"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674912 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="ovn-northd"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674930 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674946 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0a3f5a-2119-403c-8b4c-e452465a71e8" containerName="kube-state-metrics"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674966 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.674981 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675001 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-httpd"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675017 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675031 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675047 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="sg-core"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675063 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="644a363d-bd2b-4cb5-81bf-05f7514d7abe" containerName="nova-scheduler-scheduler"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675079 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="ceilometer-notification-agent"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675144 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675170 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675192 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="472be6e7-d046-4377-b055-50828b00b8cd" containerName="barbican-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675207 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed0e157-f34a-4343-ae3b-71e045eb4cf4" containerName="nova-metadata-metadata"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675220 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f921c6-ec0a-46f5-b3c3-5d479690d0e5" containerName="placement-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675239 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85085ef-a23e-41f4-8839-08915aaaef7e" containerName="nova-api-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675252 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="824fc658-1c02-4470-9ed3-e4123ddd7575" containerName="rabbitmq"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675268 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="adddcef2-e42a-4f9c-a1c9-08b8253e7616" containerName="keystone-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675287 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79bb5e5-a6e6-46ee-b04c-bd5249adb8bd" containerName="proxy-httpd"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675309 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="011ed99a-688f-4874-b6f7-f861080ef9d5" containerName="galera"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675321 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675342 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9df5c029-e707-4159-b8ec-2fb5dba38094" containerName="nova-cell1-conductor-conductor"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675360 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-api"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675450 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f17b5eda-d0f1-4e8d-a807-cf1a0bb2928a" containerName="barbican-worker"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675468 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d440b432-d2ce-4228-90b7-ad0c2e12ec86" containerName="memcached"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675486 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a6e525-1342-4031-8c3d-5920b8016c8e" containerName="barbican-keystone-listener"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675499 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c50e0a2-f119-4a1a-911f-f7898cceddb8" containerName="openstack-network-exporter"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675518 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6754e0-125e-446b-8ef2-fc58883f6c76" containerName="glance-log"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675535 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63dd2b1-3f35-45bf-8e69-170e3e980eac" containerName="neutron-httpd"
Feb 01 07:13:15 crc kubenswrapper[5127]: E0201 07:13:15.675818 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.675836 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a37fc0-b8b7-4bbc-ab43-2dae28037ee0" containerName="mariadb-account-create-update"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.678891 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.695952 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"]
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.806939 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.807086 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwpv\" (UniqueName: \"kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.807345 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.909355 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.909465 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwpv\" (UniqueName: \"kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.909551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.910121 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.910490 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:15 crc kubenswrapper[5127]: I0201 07:13:15.933191 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwpv\" (UniqueName: \"kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv\") pod \"redhat-marketplace-c75j8\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:16 crc kubenswrapper[5127]: I0201 07:13:16.022731 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c75j8"
Feb 01 07:13:16 crc kubenswrapper[5127]: I0201 07:13:16.479504 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"]
Feb 01 07:13:16 crc kubenswrapper[5127]: I0201 07:13:16.866982 5127 generic.go:334] "Generic (PLEG): container finished" podID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerID="80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba" exitCode=0
Feb 01 07:13:16 crc kubenswrapper[5127]: I0201 07:13:16.867130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerDied","Data":"80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba"}
Feb 01 07:13:16 crc kubenswrapper[5127]: I0201 07:13:16.867404 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerStarted","Data":"5789f860fd37eb300b7c48cf415f7692c933497fad5fa87a02f211361423946b"}
Feb 01 07:13:17 crc kubenswrapper[5127]: E0201 07:13:17.413963 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:17 crc kubenswrapper[5127]: E0201 07:13:17.416761 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:17 crc kubenswrapper[5127]: E0201 07:13:17.419004 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 01 07:13:17 crc kubenswrapper[5127]: E0201 07:13:17.419147 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor"
Feb 01 07:13:17 crc kubenswrapper[5127]: I0201 07:13:17.880819 5127 generic.go:334] "Generic (PLEG): container finished" podID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerID="2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef" exitCode=0
Feb 01 07:13:17 crc kubenswrapper[5127]: I0201 07:13:17.880885 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerDied","Data":"2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef"}
Feb 01 07:13:18 crc kubenswrapper[5127]: I0201 07:13:18.895008 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerStarted","Data":"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f"}
Feb 01 07:13:18 crc kubenswrapper[5127]: I0201 07:13:18.921576 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c75j8" podStartSLOduration=2.505569051 podStartE2EDuration="3.921556813s" podCreationTimestamp="2026-02-01 07:13:15 +0000 UTC" firstStartedPulling="2026-02-01 07:13:16.869448109 +0000 UTC m=+1547.355350512" lastFinishedPulling="2026-02-01 07:13:18.285435911 +0000 UTC m=+1548.771338274" observedRunningTime="2026-02-01 07:13:18.916792884 +0000 UTC m=+1549.402695257" watchObservedRunningTime="2026-02-01 07:13:18.921556813 +0000 UTC m=+1549.407459186"
Feb 01 07:13:19 crc kubenswrapper[5127]: I0201 07:13:19.905229 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9przj_a3845481-effe-4cb2-9249-e9311df519a0/ovs-vswitchd/0.log"
Feb 01 07:13:19 crc kubenswrapper[5127]: I0201 07:13:19.907358 5127 generic.go:334] "Generic (PLEG): container finished" podID="a3845481-effe-4cb2-9249-e9311df519a0" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" exitCode=137
Feb 01 07:13:19 crc kubenswrapper[5127]: I0201 07:13:19.907460 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerDied","Data":"3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea"}
Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939240 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea is running failed: container process not found" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939297 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939654 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea is running failed: container process not found" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939688 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of
4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939952 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea is running failed: container process not found" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.939998 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd" Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.940078 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 01 07:13:19 crc kubenswrapper[5127]: E0201 07:13:19.940106 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9przj" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.426530 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9przj_a3845481-effe-4cb2-9249-e9311df519a0/ovs-vswitchd/0.log" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.428176 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606710 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606816 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606867 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606946 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw22w\" (UniqueName: \"kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.606961 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib\") pod \"a3845481-effe-4cb2-9249-e9311df519a0\" (UID: \"a3845481-effe-4cb2-9249-e9311df519a0\") " Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.607502 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log" (OuterVolumeSpecName: "var-log") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.607521 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib" (OuterVolumeSpecName: "var-lib") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.607545 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.607590 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run" (OuterVolumeSpecName: "var-run") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.609398 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts" (OuterVolumeSpecName: "scripts") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.616293 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w" (OuterVolumeSpecName: "kube-api-access-lw22w") pod "a3845481-effe-4cb2-9249-e9311df519a0" (UID: "a3845481-effe-4cb2-9249-e9311df519a0"). InnerVolumeSpecName "kube-api-access-lw22w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.713881 5127 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-log\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.714213 5127 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-run\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.714222 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a3845481-effe-4cb2-9249-e9311df519a0-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.714230 5127 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.714238 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw22w\" (UniqueName: \"kubernetes.io/projected/a3845481-effe-4cb2-9249-e9311df519a0-kube-api-access-lw22w\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.714249 5127 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/a3845481-effe-4cb2-9249-e9311df519a0-var-lib\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:20 crc kubenswrapper[5127]: E0201 07:13:20.919897 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 01 07:13:20 crc kubenswrapper[5127]: E0201 07:13:20.920192 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:13:52.920170961 +0000 UTC m=+1583.406073324 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.930241 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerID="2346499dd9e7c21de3823593069c8520d97c16bb6dede126e55ac71fc4a085b0" exitCode=137 Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.930337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"2346499dd9e7c21de3823593069c8520d97c16bb6dede126e55ac71fc4a085b0"} Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.932846 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9przj_a3845481-effe-4cb2-9249-e9311df519a0/ovs-vswitchd/0.log" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.933434 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9przj" event={"ID":"a3845481-effe-4cb2-9249-e9311df519a0","Type":"ContainerDied","Data":"b408ff1c80292ac5017e44c6075c08c00a28d68c4ade1b10246a67d7709fca75"} Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.933491 5127 scope.go:117] "RemoveContainer" containerID="3fc58fc4098c61f7375e4c51add4f398815d00f85cc7d2628fa5c1840e92caea" Feb 01 07:13:20 crc kubenswrapper[5127]: I0201 07:13:20.933525 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9przj" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.011649 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.012812 5127 scope.go:117] "RemoveContainer" containerID="4e00b98f1952bdc73e6a13a74f9742815a2b243d2fe8e0b7ece9cdc7662d0783" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.013794 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmg5q\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020716 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020767 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-9przj"] Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020777 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020879 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020913 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.020948 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\" (UID: \"7e0ea2ea-fe04-40c8-87d0-1321996cbcba\") " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.021342 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock" (OuterVolumeSpecName: "lock") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.022549 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache" (OuterVolumeSpecName: "cache") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.025931 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q" (OuterVolumeSpecName: "kube-api-access-xmg5q") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "kube-api-access-xmg5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.026044 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.027800 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.040953 5127 scope.go:117] "RemoveContainer" containerID="ce93d2dfd29066e0859bd832cc0c9e0c839d3111b25ef6f8c4cfba58f6729a4a" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.123176 5127 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-lock\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.123333 5127 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.123390 5127 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-cache\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.123475 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.127306 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmg5q\" (UniqueName: \"kubernetes.io/projected/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-kube-api-access-xmg5q\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.136442 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.228815 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.323693 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "7e0ea2ea-fe04-40c8-87d0-1321996cbcba" (UID: "7e0ea2ea-fe04-40c8-87d0-1321996cbcba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.331219 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0ea2ea-fe04-40c8-87d0-1321996cbcba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.956559 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7e0ea2ea-fe04-40c8-87d0-1321996cbcba","Type":"ContainerDied","Data":"4c8bb7179e7ba51f0ef68dc2d28ca9439c5e28f50cae2faea0e11c50c1fdfc5d"} Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.956973 5127 scope.go:117] "RemoveContainer" containerID="2346499dd9e7c21de3823593069c8520d97c16bb6dede126e55ac71fc4a085b0" Feb 01 07:13:21 crc kubenswrapper[5127]: I0201 07:13:21.957192 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.005668 5127 scope.go:117] "RemoveContainer" containerID="005c8c714d8be3311a798fc93522b27e5504130f9c1fa418f83c1ab86906035c" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.018643 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.031348 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.068412 5127 scope.go:117] "RemoveContainer" containerID="5653ed02c5b90531b86d9ac767b79937dac0b76281e108a3a937c34943529698" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.089817 5127 scope.go:117] "RemoveContainer" containerID="95be98dc047c279bbce09d7aa189270919803433cfe5dc74d1073beb651e9b25" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.108811 5127 scope.go:117] "RemoveContainer" containerID="914b8bb69bc3bfe2d7935699ef76aca574042432793c4d5754b940ebe207865b" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.130430 5127 scope.go:117] "RemoveContainer" containerID="d37333ecd6017a5cdc098711dfbdfa4e7ddb88dafd4fb0421fa3c8183a90db30" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.160025 5127 scope.go:117] "RemoveContainer" containerID="761c57fdee0d6b1288274f98290bb8cd974e5bc157c50992d2820212429734cd" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.190140 5127 scope.go:117] "RemoveContainer" containerID="70ea6924342a0f91d794401284c82d8dac971be34d4d31d1be2404903e52efc7" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.211186 5127 scope.go:117] "RemoveContainer" containerID="cd0408279605bd61bef597ecbbdac3b1f047aa35e8239141c5d37982ce44fb47" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.233675 5127 scope.go:117] "RemoveContainer" containerID="d54876c49569e6c608f8538949b55b1c199573d434261c07be1e7783a323003f" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.247386 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" path="/var/lib/kubelet/pods/7e0ea2ea-fe04-40c8-87d0-1321996cbcba/volumes" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.250319 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3845481-effe-4cb2-9249-e9311df519a0" path="/var/lib/kubelet/pods/a3845481-effe-4cb2-9249-e9311df519a0/volumes" Feb 01 
07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.259217 5127 scope.go:117] "RemoveContainer" containerID="c9704dce7fc0e07cc3a655f4772728e2831f5da440ded50d8d887f0c56f5d13f" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.289861 5127 scope.go:117] "RemoveContainer" containerID="38b76cedf92a4bf003f4c614f64605b3a7cbd585d2e9ecb5e1043de091b2dd25" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.323308 5127 scope.go:117] "RemoveContainer" containerID="2ba3574e531a65aa332d467e2a747abc31633121e49ff04e0c8f64ec009d6670" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.358766 5127 scope.go:117] "RemoveContainer" containerID="d7b1d3dad0001903399762e7d439bb31968d3d63d3ab70bde24fdd9f1e6316ee" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.388565 5127 scope.go:117] "RemoveContainer" containerID="867db1559afad96de83c300d6dc76b9f79d3c9220b0e2eb9728b097d71713a33" Feb 01 07:13:22 crc kubenswrapper[5127]: E0201 07:13:22.419236 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:22 crc kubenswrapper[5127]: E0201 07:13:22.421332 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:22 crc kubenswrapper[5127]: E0201 07:13:22.423432 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:22 crc kubenswrapper[5127]: E0201 07:13:22.423492 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.968328 5127 generic.go:334] "Generic (PLEG): container finished" podID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerID="4af9462363cbd843b41f9156dcb55cc0f9bf5eaaa495c0ecf057de6e3872a505" exitCode=137 Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.968419 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerDied","Data":"4af9462363cbd843b41f9156dcb55cc0f9bf5eaaa495c0ecf057de6e3872a505"} Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.978793 5127 generic.go:334] "Generic (PLEG): container finished" podID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerID="8aebd54ceb5f8e1f71a8b3d2cb3b9f0e38e504b94503477ac609f10e118d1895" exitCode=137 Feb 01 07:13:22 crc kubenswrapper[5127]: I0201 07:13:22.978868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" 
event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerDied","Data":"8aebd54ceb5f8e1f71a8b3d2cb3b9f0e38e504b94503477ac609f10e118d1895"} Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.064389 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.197600 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs\") pod \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.197693 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wgt6\" (UniqueName: \"kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6\") pod \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.197770 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom\") pod \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.197808 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data\") pod \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.197851 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle\") pod \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\" (UID: \"1330cbe6-a302-4ac6-89ec-b5f3b5791503\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.200292 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs" (OuterVolumeSpecName: "logs") pod "1330cbe6-a302-4ac6-89ec-b5f3b5791503" (UID: "1330cbe6-a302-4ac6-89ec-b5f3b5791503"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.204828 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6" (OuterVolumeSpecName: "kube-api-access-8wgt6") pod "1330cbe6-a302-4ac6-89ec-b5f3b5791503" (UID: "1330cbe6-a302-4ac6-89ec-b5f3b5791503"). InnerVolumeSpecName "kube-api-access-8wgt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.206264 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1330cbe6-a302-4ac6-89ec-b5f3b5791503" (UID: "1330cbe6-a302-4ac6-89ec-b5f3b5791503"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.222232 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1330cbe6-a302-4ac6-89ec-b5f3b5791503" (UID: "1330cbe6-a302-4ac6-89ec-b5f3b5791503"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.238293 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data" (OuterVolumeSpecName: "config-data") pod "1330cbe6-a302-4ac6-89ec-b5f3b5791503" (UID: "1330cbe6-a302-4ac6-89ec-b5f3b5791503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.263031 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.299259 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wgt6\" (UniqueName: \"kubernetes.io/projected/1330cbe6-a302-4ac6-89ec-b5f3b5791503-kube-api-access-8wgt6\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.299290 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.299303 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.299314 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1330cbe6-a302-4ac6-89ec-b5f3b5791503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.299325 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1330cbe6-a302-4ac6-89ec-b5f3b5791503-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.400893 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data\") pod \"7ff7407e-28d1-4e89-829a-72a38dd882d7\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.400946 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs\") pod \"7ff7407e-28d1-4e89-829a-72a38dd882d7\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.401007 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom\") pod \"7ff7407e-28d1-4e89-829a-72a38dd882d7\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 
07:13:23.401044 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5w8x\" (UniqueName: \"kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x\") pod \"7ff7407e-28d1-4e89-829a-72a38dd882d7\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.401117 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle\") pod \"7ff7407e-28d1-4e89-829a-72a38dd882d7\" (UID: \"7ff7407e-28d1-4e89-829a-72a38dd882d7\") " Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.401338 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs" (OuterVolumeSpecName: "logs") pod "7ff7407e-28d1-4e89-829a-72a38dd882d7" (UID: "7ff7407e-28d1-4e89-829a-72a38dd882d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.401705 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff7407e-28d1-4e89-829a-72a38dd882d7-logs\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.403890 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ff7407e-28d1-4e89-829a-72a38dd882d7" (UID: "7ff7407e-28d1-4e89-829a-72a38dd882d7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.404750 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x" (OuterVolumeSpecName: "kube-api-access-l5w8x") pod "7ff7407e-28d1-4e89-829a-72a38dd882d7" (UID: "7ff7407e-28d1-4e89-829a-72a38dd882d7"). InnerVolumeSpecName "kube-api-access-l5w8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.419993 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ff7407e-28d1-4e89-829a-72a38dd882d7" (UID: "7ff7407e-28d1-4e89-829a-72a38dd882d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.437034 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data" (OuterVolumeSpecName: "config-data") pod "7ff7407e-28d1-4e89-829a-72a38dd882d7" (UID: "7ff7407e-28d1-4e89-829a-72a38dd882d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.503267 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.503309 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.503321 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff7407e-28d1-4e89-829a-72a38dd882d7-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.503331 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5w8x\" (UniqueName: \"kubernetes.io/projected/7ff7407e-28d1-4e89-829a-72a38dd882d7-kube-api-access-l5w8x\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.997100 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5759588f57-nkg6k" Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.997120 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5759588f57-nkg6k" event={"ID":"7ff7407e-28d1-4e89-829a-72a38dd882d7","Type":"ContainerDied","Data":"d28cff8b39b659cdf4b7caa9656f240d35e8ee4cd34b71e9e2c6158d6335ee31"} Feb 01 07:13:23 crc kubenswrapper[5127]: I0201 07:13:23.997232 5127 scope.go:117] "RemoveContainer" containerID="8aebd54ceb5f8e1f71a8b3d2cb3b9f0e38e504b94503477ac609f10e118d1895" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.003050 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" event={"ID":"1330cbe6-a302-4ac6-89ec-b5f3b5791503","Type":"ContainerDied","Data":"0f9652ab95a79490e372b040fabbf131ef3734793f6de210d667945bbfd031f8"} Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.003134 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-76b4c49b66-pjvd5" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.091778 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"] Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.099268 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-76b4c49b66-pjvd5"] Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.099605 5127 scope.go:117] "RemoveContainer" containerID="d23c240700660131c41788080e6a4a7bff561ffb7789deafac9dc5832ba94354" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.117250 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"] Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.124007 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5759588f57-nkg6k"] Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.133224 5127 scope.go:117] "RemoveContainer" containerID="4af9462363cbd843b41f9156dcb55cc0f9bf5eaaa495c0ecf057de6e3872a505" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.156527 5127 scope.go:117] "RemoveContainer" containerID="967655f08a7adad63c2db4ecc270313cb506f4dbb7d2de93e05145f31cc59387" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.252481 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" path="/var/lib/kubelet/pods/1330cbe6-a302-4ac6-89ec-b5f3b5791503/volumes" Feb 01 07:13:24 crc kubenswrapper[5127]: I0201 07:13:24.253266 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" path="/var/lib/kubelet/pods/7ff7407e-28d1-4e89-829a-72a38dd882d7/volumes" Feb 01 07:13:26 crc kubenswrapper[5127]: I0201 07:13:26.024434 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:26 crc kubenswrapper[5127]: I0201 07:13:26.024542 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:26 crc kubenswrapper[5127]: I0201 07:13:26.099658 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:27 crc kubenswrapper[5127]: I0201 07:13:27.105699 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:27 crc kubenswrapper[5127]: I0201 07:13:27.157490 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"] Feb 01 07:13:27 crc kubenswrapper[5127]: E0201 07:13:27.413397 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:27 crc kubenswrapper[5127]: E0201 07:13:27.418971 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:27 crc 
kubenswrapper[5127]: E0201 07:13:27.429067 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 07:13:27 crc kubenswrapper[5127]: E0201 07:13:27.429324 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.074328 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c75j8" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="registry-server" containerID="cri-o://49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f" gracePeriod=2 Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.722634 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.813202 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content\") pod \"f33297fc-1aac-4130-941f-f2f21472f1d4\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.813293 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities\") pod \"f33297fc-1aac-4130-941f-f2f21472f1d4\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.813636 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xwpv\" (UniqueName: \"kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv\") pod \"f33297fc-1aac-4130-941f-f2f21472f1d4\" (UID: \"f33297fc-1aac-4130-941f-f2f21472f1d4\") " Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.814394 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities" (OuterVolumeSpecName: "utilities") pod "f33297fc-1aac-4130-941f-f2f21472f1d4" (UID: "f33297fc-1aac-4130-941f-f2f21472f1d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.814716 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.821853 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv" (OuterVolumeSpecName: "kube-api-access-7xwpv") pod "f33297fc-1aac-4130-941f-f2f21472f1d4" (UID: "f33297fc-1aac-4130-941f-f2f21472f1d4"). InnerVolumeSpecName "kube-api-access-7xwpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.868408 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f33297fc-1aac-4130-941f-f2f21472f1d4" (UID: "f33297fc-1aac-4130-941f-f2f21472f1d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.916458 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xwpv\" (UniqueName: \"kubernetes.io/projected/f33297fc-1aac-4130-941f-f2f21472f1d4-kube-api-access-7xwpv\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:29 crc kubenswrapper[5127]: I0201 07:13:29.916795 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f33297fc-1aac-4130-941f-f2f21472f1d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.087537 5127 generic.go:334] "Generic (PLEG): container finished" podID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerID="49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f" exitCode=0 Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.087624 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerDied","Data":"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f"} Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.087659 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c75j8" Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.087706 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c75j8" event={"ID":"f33297fc-1aac-4130-941f-f2f21472f1d4","Type":"ContainerDied","Data":"5789f860fd37eb300b7c48cf415f7692c933497fad5fa87a02f211361423946b"} Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.087736 5127 scope.go:117] "RemoveContainer" containerID="49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f" Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.141073 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"] Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.144488 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c75j8"] Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.153921 5127 scope.go:117] "RemoveContainer" containerID="2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef" Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.188133 5127 scope.go:117] "RemoveContainer" containerID="80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba" Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.207959 5127 scope.go:117] "RemoveContainer" containerID="49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f" Feb 01 07:13:30 crc kubenswrapper[5127]: E0201 07:13:30.208659 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f\": container with ID starting with 
49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f not found: ID does not exist" containerID="49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.208725 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f"} err="failed to get container status \"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f\": rpc error: code = NotFound desc = could not find container \"49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f\": container with ID starting with 49f1bd410e2c9b821f9c4ec866b119307db234b0ba9ca6c56f837be279c7861f not found: ID does not exist"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.208769 5127 scope.go:117] "RemoveContainer" containerID="2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef"
Feb 01 07:13:30 crc kubenswrapper[5127]: E0201 07:13:30.209170 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef\": container with ID starting with 2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef not found: ID does not exist" containerID="2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.209235 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef"} err="failed to get container status \"2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef\": rpc error: code = NotFound desc = could not find container \"2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef\": container with ID starting with 2d3537426e7a5321f2e6b5a27392f97cabb0912db1e321648dd5a9dca65fe8ef not found: ID does not exist"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.209279 5127 scope.go:117] "RemoveContainer" containerID="80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba"
Feb 01 07:13:30 crc kubenswrapper[5127]: E0201 07:13:30.209869 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba\": container with ID starting with 80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba not found: ID does not exist" containerID="80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.209911 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba"} err="failed to get container status \"80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba\": rpc error: code = NotFound desc = could not find container \"80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba\": container with ID starting with 80f7d95df117b8d48c41d3a02b133c1e87e01bdb3ee134ef1d84b069806815ba not found: ID does not exist"
Feb 01 07:13:30 crc kubenswrapper[5127]: I0201 07:13:30.250174 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" path="/var/lib/kubelet/pods/f33297fc-1aac-4130-941f-f2f21472f1d4/volumes"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.092519 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.103227 5127 generic.go:334] "Generic (PLEG): container finished" podID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4" exitCode=137
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.103355 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4bdc00fa-e725-42ac-8336-e78b107b64e6","Type":"ContainerDied","Data":"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"}
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.103399 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4bdc00fa-e725-42ac-8336-e78b107b64e6","Type":"ContainerDied","Data":"362d0588973ae23a05220b077099960259624f1e79fbb73e64e23d8616d697c4"}
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.103401 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.103438 5127 scope.go:117] "RemoveContainer" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.132930 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle\") pod \"4bdc00fa-e725-42ac-8336-e78b107b64e6\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") "
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.133065 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mldtz\" (UniqueName: \"kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz\") pod \"4bdc00fa-e725-42ac-8336-e78b107b64e6\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") "
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.133171 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data\") pod \"4bdc00fa-e725-42ac-8336-e78b107b64e6\" (UID: \"4bdc00fa-e725-42ac-8336-e78b107b64e6\") "
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.140824 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz" (OuterVolumeSpecName: "kube-api-access-mldtz") pod "4bdc00fa-e725-42ac-8336-e78b107b64e6" (UID: "4bdc00fa-e725-42ac-8336-e78b107b64e6"). InnerVolumeSpecName "kube-api-access-mldtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.143875 5127 scope.go:117] "RemoveContainer" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"
Feb 01 07:13:31 crc kubenswrapper[5127]: E0201 07:13:31.145235 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4\": container with ID starting with 07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4 not found: ID does not exist" containerID="07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.145284 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4"} err="failed to get container status \"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4\": rpc error: code = NotFound desc = could not find container \"07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4\": container with ID starting with 07dde0b752e773303f5085a57d34dfcd5c0ef6c456f3108a75b87093a09763d4 not found: ID does not exist"
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.159233 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bdc00fa-e725-42ac-8336-e78b107b64e6" (UID: "4bdc00fa-e725-42ac-8336-e78b107b64e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.163405 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data" (OuterVolumeSpecName: "config-data") pod "4bdc00fa-e725-42ac-8336-e78b107b64e6" (UID: "4bdc00fa-e725-42ac-8336-e78b107b64e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.238454 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mldtz\" (UniqueName: \"kubernetes.io/projected/4bdc00fa-e725-42ac-8336-e78b107b64e6-kube-api-access-mldtz\") on node \"crc\" DevicePath \"\""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.239283 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.239976 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdc00fa-e725-42ac-8336-e78b107b64e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.445528 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 01 07:13:31 crc kubenswrapper[5127]: I0201 07:13:31.456771 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 01 07:13:32 crc kubenswrapper[5127]: I0201 07:13:32.250745 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" path="/var/lib/kubelet/pods/4bdc00fa-e725-42ac-8336-e78b107b64e6/volumes"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.415368 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kd9sq"]
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416497 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416524 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416543 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416557 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416575 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416622 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416649 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="swift-recon-cron"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416662 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="swift-recon-cron"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416682 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416695 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416721 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416734 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416753 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416766 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416784 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416799 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416822 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="extract-content"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416836 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="extract-content"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416857 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416869 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416891 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server-init"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416905 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server-init"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416921 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416935 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416959 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-reaper"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.416971 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-reaper"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.416992 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417006 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417027 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417040 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417059 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-expirer"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417073 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-expirer"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417090 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417103 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417122 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="rsync"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417135 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="rsync"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417148 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="registry-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417161 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="registry-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417182 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="extract-utilities"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417196 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="extract-utilities"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417217 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417230 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417245 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417258 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417284 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417297 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417316 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417329 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417347 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417360 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: E0201 07:13:50.417382 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417394 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417779 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417812 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417829 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417849 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417865 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417885 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417906 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417929 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417951 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-updater"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417975 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f33297fc-1aac-4130-941f-f2f21472f1d4" containerName="registry-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.417989 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff7407e-28d1-4e89-829a-72a38dd882d7" containerName="barbican-worker-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418009 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-expirer"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418024 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1330cbe6-a302-4ac6-89ec-b5f3b5791503" containerName="barbican-keystone-listener-log"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418040 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovsdb-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418061 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3845481-effe-4cb2-9249-e9311df519a0" containerName="ovs-vswitchd"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418076 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-auditor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418092 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="rsync"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418120 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="swift-recon-cron"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418132 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bdc00fa-e725-42ac-8336-e78b107b64e6" containerName="nova-cell0-conductor-conductor"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418149 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418166 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="account-reaper"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418184 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="object-replicator"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.418198 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0ea2ea-fe04-40c8-87d0-1321996cbcba" containerName="container-server"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.420344 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.432109 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kd9sq"]
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.454934 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs5qb\" (UniqueName: \"kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.455090 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.455177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.555938 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.556006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs5qb\" (UniqueName: \"kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.556082 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.556513 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.556994 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.585212 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs5qb\" (UniqueName: \"kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb\") pod \"community-operators-kd9sq\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") " pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:50 crc kubenswrapper[5127]: I0201 07:13:50.747028 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:13:51 crc kubenswrapper[5127]: I0201 07:13:51.280818 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kd9sq"]
Feb 01 07:13:51 crc kubenswrapper[5127]: W0201 07:13:51.289293 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeae6f58_9321_43ec_b086_436aff74ae30.slice/crio-93fcc97623a65e8a2fb5b27e20b269520ceb27f537fe1d5b8475bedabc9a8438 WatchSource:0}: Error finding container 93fcc97623a65e8a2fb5b27e20b269520ceb27f537fe1d5b8475bedabc9a8438: Status 404 returned error can't find the container with id 93fcc97623a65e8a2fb5b27e20b269520ceb27f537fe1d5b8475bedabc9a8438
Feb 01 07:13:51 crc kubenswrapper[5127]: I0201 07:13:51.348718 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerStarted","Data":"93fcc97623a65e8a2fb5b27e20b269520ceb27f537fe1d5b8475bedabc9a8438"}
Feb 01 07:13:52 crc kubenswrapper[5127]: I0201 07:13:52.364416 5127 generic.go:334] "Generic (PLEG): container finished" podID="beae6f58-9321-43ec-b086-436aff74ae30" containerID="4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2" exitCode=0
Feb 01 07:13:52 crc kubenswrapper[5127]: I0201 07:13:52.364825 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerDied","Data":"4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2"}
Feb 01 07:13:52 crc kubenswrapper[5127]: E0201 07:13:52.997280 5127 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 01 07:13:52 crc kubenswrapper[5127]: E0201 07:13:52.997415 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data podName:23799dc8-9944-4c3d-a0e1-cf99f5cb7998 nodeName:}" failed. No retries permitted until 2026-02-01 07:14:56.997380784 +0000 UTC m=+1647.483283187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data") pod "rabbitmq-server-0" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998") : configmap "rabbitmq-config-data" not found
Feb 01 07:13:53 crc kubenswrapper[5127]: I0201 07:13:53.379125 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerStarted","Data":"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2"}
Feb 01 07:13:53 crc kubenswrapper[5127]: E0201 07:13:53.984490 5127 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Feb 01 07:13:53 crc kubenswrapper[5127]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-88-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-458-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: > execCommand=["/bin/bash","-c","if [ ! -z \"$(cat /etc/pod-info/skipPreStopChecks)\" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 \u0026\u0026 rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true \u0026\u0026 rabbitmq-upgrade drain -t 604800"] containerName="rabbitmq" pod="openstack/rabbitmq-server-0" message=<
Feb 01 07:13:53 crc kubenswrapper[5127]: Will put node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack into maintenance mode. The node will no longer serve any client traffic!
Feb 01 07:13:53 crc kubenswrapper[5127]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-88-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-458-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: >
Feb 01 07:13:53 crc kubenswrapper[5127]: E0201 07:13:53.984608 5127 kuberuntime_container.go:691] "PreStop hook failed" err=<
Feb 01 07:13:53 crc kubenswrapper[5127]: command '/bin/bash -c if [ ! -z "$(cat /etc/pod-info/skipPreStopChecks)" ]; then exit 0; fi; rabbitmq-upgrade await_online_quorum_plus_one -t 604800 && rabbitmq-upgrade await_online_synchronized_mirror -t 604800 || true && rabbitmq-upgrade drain -t 604800' exited with 69: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-88-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Error: unable to perform an operation on node 'rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'. Please see diagnostics information and suggestions below.
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Most common reasons for this are:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is unreachable (e.g. due to hostname resolution, TCP connection or firewall issues)
Feb 01 07:13:53 crc kubenswrapper[5127]: * CLI tool fails to authenticate with the server (e.g. due to CLI tool's Erlang cookie not matching that of the server)
Feb 01 07:13:53 crc kubenswrapper[5127]: * Target node is not running
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: In addition to the diagnostics info below:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: * See the CLI, clustering and networking guides on https://rabbitmq.com/documentation.html to learn more
Feb 01 07:13:53 crc kubenswrapper[5127]: * Consult server logs on node rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack
Feb 01 07:13:53 crc kubenswrapper[5127]: * If target node is configured to use long node names, don't forget to use --longnames with CLI tools
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: DIAGNOSTICS
Feb 01 07:13:53 crc kubenswrapper[5127]: ===========
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: attempted to contact: ['rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack']
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack:
Feb 01 07:13:53 crc kubenswrapper[5127]: * unable to connect to epmd (port 4369) on rabbitmq-server-0.rabbitmq-nodes.openstack: nxdomain (non-existing domain)
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: Current node details:
Feb 01 07:13:53 crc kubenswrapper[5127]: * node name: 'rabbitmqcli-458-rabbit@rabbitmq-server-0.rabbitmq-nodes.openstack'
Feb 01 07:13:53 crc kubenswrapper[5127]: * effective user's home directory: /var/lib/rabbitmq
Feb 01 07:13:53 crc kubenswrapper[5127]: * Erlang cookie hash: UFEV+sTYlyNb1kYxJq6/4Q==
Feb 01 07:13:53 crc kubenswrapper[5127]:
Feb 01 07:13:53 crc kubenswrapper[5127]: > pod="openstack/rabbitmq-server-0" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" containerID="cri-o://c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f"
Feb 01 07:13:53 crc kubenswrapper[5127]: I0201 07:13:53.984684 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" containerID="cri-o://c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f" gracePeriod=604738
Feb 01 07:13:54 crc kubenswrapper[5127]: I0201 07:13:54.394878 5127 generic.go:334] "Generic (PLEG): container finished" podID="beae6f58-9321-43ec-b086-436aff74ae30" containerID="9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2" exitCode=0
Feb 01 07:13:54 crc kubenswrapper[5127]: I0201 07:13:54.395002 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerDied","Data":"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2"}
Feb 01 07:13:54 crc kubenswrapper[5127]: I0201 07:13:54.789858 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused"
Feb 01 07:13:55 crc kubenswrapper[5127]: I0201 07:13:55.410047 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerStarted","Data":"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787"}
Feb 01 07:13:55 crc kubenswrapper[5127]: I0201 07:13:55.433895 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kd9sq" podStartSLOduration=2.996757368 podStartE2EDuration="5.433870873s" podCreationTimestamp="2026-02-01 07:13:50 +0000 UTC" firstStartedPulling="2026-02-01 07:13:52.370030349 +0000 UTC m=+1582.855932742" lastFinishedPulling="2026-02-01 07:13:54.807143854 +0000 UTC m=+1585.293046247" observedRunningTime="2026-02-01 07:13:55.430471951 +0000 UTC m=+1585.916374394" watchObservedRunningTime="2026-02-01 07:13:55.433870873 +0000 UTC m=+1585.919773276"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.487608 5127 generic.go:334] "Generic (PLEG): container finished" podID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerID="c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f" exitCode=0
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.487953 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerDied","Data":"c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f"}
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.597570 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728049 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728093 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728130 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728156 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728188 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728208 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728242 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728270 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728296 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728321 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.728381 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvkx9\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9\") pod \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\" (UID: \"23799dc8-9944-4c3d-a0e1-cf99f5cb7998\") "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.729511 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.729647 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.730153 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.732917 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.733508 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.734358 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.734560 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9" (OuterVolumeSpecName: "kube-api-access-hvkx9") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "kube-api-access-hvkx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.736501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info" (OuterVolumeSpecName: "pod-info") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.747878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.747951 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.757411 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data" (OuterVolumeSpecName: "config-data") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.788621 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf" (OuterVolumeSpecName: "server-conf") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.799233 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.833385 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834056 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834080 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834096 5127 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-server-conf\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834112 5127 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-pod-info\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834128 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834145 5127 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834160 5127 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834174 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.834191 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvkx9\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-kube-api-access-hvkx9\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.848425 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.859298 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "23799dc8-9944-4c3d-a0e1-cf99f5cb7998" (UID: "23799dc8-9944-4c3d-a0e1-cf99f5cb7998"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.936630 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/23799dc8-9944-4c3d-a0e1-cf99f5cb7998-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:00 crc kubenswrapper[5127]: I0201 07:14:00.936683 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.500358 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"23799dc8-9944-4c3d-a0e1-cf99f5cb7998","Type":"ContainerDied","Data":"5ea2fea9b3ea9c3b8acac3ba2ee394c286b0ef56b1eb9cf2a409be6533288b3a"}
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.500418 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.500442 5127 scope.go:117] "RemoveContainer" containerID="c7124acccfb475012da9199fb03b9140f599ad33e538d0d8d5c664659f9b893f"
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.541642 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.550100 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.550345 5127 scope.go:117] "RemoveContainer" containerID="c82dbbe0eb6ca71336161015ea284573bf1cf53a6b5fb5824650267c1ab2d8a7"
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.576953 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:14:01 crc kubenswrapper[5127]: I0201 07:14:01.640667 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kd9sq"]
Feb 01 07:14:02 crc kubenswrapper[5127]: I0201 07:14:02.246150 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" path="/var/lib/kubelet/pods/23799dc8-9944-4c3d-a0e1-cf99f5cb7998/volumes"
Feb 01 07:14:03 crc kubenswrapper[5127]: I0201 07:14:03.528575 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kd9sq" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="registry-server" containerID="cri-o://5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787" gracePeriod=2
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.064689 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd9sq"
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.186465 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities\") pod \"beae6f58-9321-43ec-b086-436aff74ae30\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") "
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.186652 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs5qb\" (UniqueName: \"kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb\") pod \"beae6f58-9321-43ec-b086-436aff74ae30\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") "
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.189121 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content\") pod \"beae6f58-9321-43ec-b086-436aff74ae30\" (UID: \"beae6f58-9321-43ec-b086-436aff74ae30\") "
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.188890 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities" (OuterVolumeSpecName: "utilities") pod "beae6f58-9321-43ec-b086-436aff74ae30" (UID: "beae6f58-9321-43ec-b086-436aff74ae30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.193818 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb" (OuterVolumeSpecName: "kube-api-access-fs5qb") pod "beae6f58-9321-43ec-b086-436aff74ae30" (UID: "beae6f58-9321-43ec-b086-436aff74ae30"). InnerVolumeSpecName "kube-api-access-fs5qb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.296403 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs5qb\" (UniqueName: \"kubernetes.io/projected/beae6f58-9321-43ec-b086-436aff74ae30-kube-api-access-fs5qb\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.296454 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.410319 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beae6f58-9321-43ec-b086-436aff74ae30" (UID: "beae6f58-9321-43ec-b086-436aff74ae30"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.500107 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beae6f58-9321-43ec-b086-436aff74ae30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.543481 5127 generic.go:334] "Generic (PLEG): container finished" podID="beae6f58-9321-43ec-b086-436aff74ae30" containerID="5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787" exitCode=0 Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.543537 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerDied","Data":"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787"} Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.543614 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kd9sq" event={"ID":"beae6f58-9321-43ec-b086-436aff74ae30","Type":"ContainerDied","Data":"93fcc97623a65e8a2fb5b27e20b269520ceb27f537fe1d5b8475bedabc9a8438"} Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.543638 5127 scope.go:117] "RemoveContainer" containerID="5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.543653 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kd9sq" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.584665 5127 scope.go:117] "RemoveContainer" containerID="9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.608992 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kd9sq"] Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.616787 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kd9sq"] Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.630654 5127 scope.go:117] "RemoveContainer" containerID="4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.670279 5127 scope.go:117] "RemoveContainer" containerID="5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787" Feb 01 07:14:04 crc kubenswrapper[5127]: E0201 07:14:04.670969 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787\": container with ID starting with 5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787 not found: ID does not exist" containerID="5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.671022 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787"} err="failed to get container status \"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787\": rpc error: code = NotFound desc = could not find container \"5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787\": container with ID starting with 5a3892b4a7c91b221516d97a81beecb5cbe58635ed0b88ef385f5a0894883787 not found: ID does not exist" Feb 01 
07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.671060 5127 scope.go:117] "RemoveContainer" containerID="9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2" Feb 01 07:14:04 crc kubenswrapper[5127]: E0201 07:14:04.671935 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2\": container with ID starting with 9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2 not found: ID does not exist" containerID="9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.672089 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2"} err="failed to get container status \"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2\": rpc error: code = NotFound desc = could not find container \"9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2\": container with ID starting with 9c23b392e1361ba89fe55ece4406ba85cef8cf6c663d306467b3573b631abdd2 not found: ID does not exist" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.672250 5127 scope.go:117] "RemoveContainer" containerID="4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2" Feb 01 07:14:04 crc kubenswrapper[5127]: E0201 07:14:04.672941 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2\": container with ID starting with 4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2 not found: ID does not exist" containerID="4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2" Feb 01 07:14:04 crc kubenswrapper[5127]: I0201 07:14:04.672985 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2"} err="failed to get container status \"4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2\": rpc error: code = NotFound desc = could not find container \"4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2\": container with ID starting with 4d7c23d3358551e1ffeddbf6a342d37518c0da4654869dc75a0fc766712622b2 not found: ID does not exist" Feb 01 07:14:06 crc kubenswrapper[5127]: I0201 07:14:06.252041 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beae6f58-9321-43ec-b086-436aff74ae30" path="/var/lib/kubelet/pods/beae6f58-9321-43ec-b086-436aff74ae30/volumes" Feb 01 07:14:36 crc kubenswrapper[5127]: I0201 07:14:36.741228 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:14:36 crc kubenswrapper[5127]: I0201 07:14:36.742026 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:14:37 crc kubenswrapper[5127]: I0201 07:14:37.983598 5127 scope.go:117] 
"RemoveContainer" containerID="050ffe26fe0782c1f8580da18ef585a4bde646662286075d7a19ca9b46fc1466" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.027687 5127 scope.go:117] "RemoveContainer" containerID="9736798c00ea577ff511a799e202c624f7065f1a456d8c24e360b90f890a6de7" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.078339 5127 scope.go:117] "RemoveContainer" containerID="e60c44613d67c39a8a8a24961d2a3544213836e4d6601012971fcd1537cbd5f1" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.115572 5127 scope.go:117] "RemoveContainer" containerID="dbeddcf12ac584f74f1a4862f408fe746c0102cea65f98e31f2a82882322563d" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.144655 5127 scope.go:117] "RemoveContainer" containerID="4daa23be369e6bde540ee4a93b4b832a252438b593a4db6bfafcc6fee04ab689" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.196276 5127 scope.go:117] "RemoveContainer" containerID="2bd659b931ab10a61b286761cd4b38488cdd2ed33afa200b2a1085ee2b5b0190" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.215230 5127 scope.go:117] "RemoveContainer" containerID="0aebb18475feef009eca7149a731d855a39ac1c88a40db2d450efef5b8a28625" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.242743 5127 scope.go:117] "RemoveContainer" containerID="b45bb4738590ea1089edb04db72919b56e3e92a86966c7ea321a6b8125920f90" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.261077 5127 scope.go:117] "RemoveContainer" containerID="2245f9d4930b5b4a91b37b1a99574bd64676e417ddb0ffd544851354e99a25a4" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.283488 5127 scope.go:117] "RemoveContainer" containerID="31960747ee7b4ab9611882d5c02143297f6b9aee515db90ed14f82b4b29d99ba" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.310326 5127 scope.go:117] "RemoveContainer" containerID="bea40ed57ae34e892c6123f528c511c5b0399243d5f26fad517fdb141bad12ef" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.330441 5127 scope.go:117] "RemoveContainer" containerID="1748afa06e5723369ad46d3a5f7b64ed4bb8c6fb93005478eccce92fe1125025" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.365774 5127 scope.go:117] "RemoveContainer" containerID="9f5b2e615ef47dfbbbd7c69e708ad20e93453ce23c36b600cace53513255813f" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.382762 5127 scope.go:117] "RemoveContainer" containerID="88291eba0b8528cf903ca664c25bb565e51ca8aefe36f5d3a39b3e4c18656b90" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.406265 5127 scope.go:117] "RemoveContainer" containerID="f7d727be361972646bce2059917ea1d6779b17a04bb30dbfaa2297ebf735b61e" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.425170 5127 scope.go:117] "RemoveContainer" containerID="a9bda3ed30a3b102ae5aafe7c3aa829f283282c23e2e72306ed45823288d360e" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.453509 5127 scope.go:117] "RemoveContainer" containerID="affbafd2f2595fb216b991e56f47604b5bf279799e0e1cdba3730599a76111ab" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.481894 5127 scope.go:117] "RemoveContainer" containerID="d161c03d918bf5904d9550124bdb130cd830048ad05dc7bf70edc02f5386bc0c" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.500221 5127 scope.go:117] "RemoveContainer" containerID="edfa52efc00b790e97ee37c12541b062c301d11ef0289437fc3e2801ea12e4c2" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.521849 5127 scope.go:117] "RemoveContainer" containerID="e1a7c551036a87709431fd53fe21f4c8c165139e9e543a5271f4b6cc8e281722" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 
07:14:38.539070 5127 scope.go:117] "RemoveContainer" containerID="69e083b16f8b5c507913fa804b7586d48a866787dd25aea8f10ed56919839621" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.574124 5127 scope.go:117] "RemoveContainer" containerID="7b50a2b6aa5d8e197f639de042061583730b740728b0fe70233696ba7d2e113e" Feb 01 07:14:38 crc kubenswrapper[5127]: I0201 07:14:38.599905 5127 scope.go:117] "RemoveContainer" containerID="44b08e72c489b008fa46527782b6bdc9a481d3a4439b530c26416808e1a4301f" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.161863 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9"] Feb 01 07:15:00 crc kubenswrapper[5127]: E0201 07:15:00.163225 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="extract-content" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163259 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="extract-content" Feb 01 07:15:00 crc kubenswrapper[5127]: E0201 07:15:00.163294 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163310 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" Feb 01 07:15:00 crc kubenswrapper[5127]: E0201 07:15:00.163337 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="extract-utilities" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163355 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="extract-utilities" Feb 01 07:15:00 crc kubenswrapper[5127]: E0201 07:15:00.163378 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="registry-server" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163398 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="registry-server" Feb 01 07:15:00 crc kubenswrapper[5127]: E0201 07:15:00.163433 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="setup-container" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163450 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="setup-container" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163836 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="beae6f58-9321-43ec-b086-436aff74ae30" containerName="registry-server" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.163882 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="23799dc8-9944-4c3d-a0e1-cf99f5cb7998" containerName="rabbitmq" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.164914 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.167496 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.168855 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9"] Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.173672 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.280482 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.280895 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdx8\" (UniqueName: \"kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.281260 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.382782 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdx8\" (UniqueName: \"kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.382873 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.383032 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.384897 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume\") pod 
\"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.393103 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.413745 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdx8\" (UniqueName: \"kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8\") pod \"collect-profiles-29498835-m57d9\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.485853 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:00 crc kubenswrapper[5127]: I0201 07:15:00.962370 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9"] Feb 01 07:15:01 crc kubenswrapper[5127]: I0201 07:15:01.156863 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" event={"ID":"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b","Type":"ContainerStarted","Data":"0f736b301a4ac0c5f2bb80e2b13b786d8e9db78740845d09ca6a1a4406ff7be8"} Feb 01 07:15:01 crc kubenswrapper[5127]: I0201 07:15:01.178692 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" podStartSLOduration=1.178667129 podStartE2EDuration="1.178667129s" podCreationTimestamp="2026-02-01 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 07:15:01.173451568 +0000 UTC m=+1651.659353931" watchObservedRunningTime="2026-02-01 07:15:01.178667129 +0000 UTC m=+1651.664569502" Feb 01 07:15:02 crc kubenswrapper[5127]: I0201 07:15:02.164269 5127 generic.go:334] "Generic (PLEG): container finished" podID="a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" containerID="22dc9a98a17b06e1c8d657e2785592225469cee52b8a58d3e627bce4bf34b3fb" exitCode=0 Feb 01 07:15:02 crc kubenswrapper[5127]: I0201 07:15:02.164345 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" event={"ID":"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b","Type":"ContainerDied","Data":"22dc9a98a17b06e1c8d657e2785592225469cee52b8a58d3e627bce4bf34b3fb"} Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.561147 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.730280 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmdx8\" (UniqueName: \"kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8\") pod \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.730920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume\") pod \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.731160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume\") pod \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\" (UID: \"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b\") " Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.732163 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume" (OuterVolumeSpecName: "config-volume") pod "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" (UID: "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.736894 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" (UID: "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.742371 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8" (OuterVolumeSpecName: "kube-api-access-nmdx8") pod "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" (UID: "a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b"). InnerVolumeSpecName "kube-api-access-nmdx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.832991 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.833040 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:03 crc kubenswrapper[5127]: I0201 07:15:03.833063 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmdx8\" (UniqueName: \"kubernetes.io/projected/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b-kube-api-access-nmdx8\") on node \"crc\" DevicePath \"\"" Feb 01 07:15:04 crc kubenswrapper[5127]: I0201 07:15:04.179387 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" event={"ID":"a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b","Type":"ContainerDied","Data":"0f736b301a4ac0c5f2bb80e2b13b786d8e9db78740845d09ca6a1a4406ff7be8"} Feb 01 07:15:04 crc kubenswrapper[5127]: I0201 07:15:04.179426 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f736b301a4ac0c5f2bb80e2b13b786d8e9db78740845d09ca6a1a4406ff7be8" Feb 01 07:15:04 crc kubenswrapper[5127]: I0201 07:15:04.179460 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9" Feb 01 07:15:06 crc kubenswrapper[5127]: I0201 07:15:06.740438 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:15:06 crc kubenswrapper[5127]: I0201 07:15:06.740857 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:15:36 crc kubenswrapper[5127]: I0201 07:15:36.741515 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:15:36 crc kubenswrapper[5127]: I0201 07:15:36.742425 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:15:36 crc kubenswrapper[5127]: I0201 07:15:36.742515 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:15:36 crc kubenswrapper[5127]: I0201 07:15:36.743670 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:15:36 crc kubenswrapper[5127]: I0201 07:15:36.743811 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" gracePeriod=600 Feb 01 07:15:36 crc kubenswrapper[5127]: E0201 07:15:36.881992 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:15:37 crc kubenswrapper[5127]: I0201 07:15:37.490772 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" exitCode=0 Feb 01 07:15:37 crc kubenswrapper[5127]: I0201 07:15:37.490836 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e"} Feb 01 07:15:37 crc kubenswrapper[5127]: I0201 07:15:37.490896 5127 scope.go:117] "RemoveContainer" containerID="ea328ac3a1fecb168f70daa3f3e516c02a9891b33e1e0a73db9093353737c6c6" Feb 01 07:15:37 crc kubenswrapper[5127]: I0201 07:15:37.491502 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:15:37 crc kubenswrapper[5127]: E0201 07:15:37.491960 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.037359 5127 scope.go:117] "RemoveContainer" containerID="b5ad36da14aa49f0c289fd664bfe924f8b83d9894729487e9da0fdae6ee61006" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.058348 5127 scope.go:117] "RemoveContainer" containerID="3488f73206b087a954bbef94f7cd739bdad5a3478cc5450889c6ba48c10a6d60" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.139366 5127 scope.go:117] "RemoveContainer" containerID="2dd5e354f57ec00475cc69e5f8c37183bd10c5dcd5115dcaf503de25d881f18b" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.173333 5127 scope.go:117] "RemoveContainer" containerID="12815831357c053ba52d5cd1f3f3dd72afee59a298bcd5c16a3985ed369f9e18" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.204415 5127 scope.go:117] "RemoveContainer" containerID="45488eefbe618c6ed70968bb3a79848f397c02da3176113bc9124b98acb538e2" Feb 01 07:15:39 crc kubenswrapper[5127]: I0201 07:15:39.222933 
5127 scope.go:117] "RemoveContainer" containerID="25aceb3b56d27bad63c6a39a8a7c21031da417b927c23a72504a99b04f2dbf18" Feb 01 07:15:48 crc kubenswrapper[5127]: I0201 07:15:48.235961 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:15:48 crc kubenswrapper[5127]: E0201 07:15:48.237215 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:16:00 crc kubenswrapper[5127]: I0201 07:16:00.244822 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:16:00 crc kubenswrapper[5127]: E0201 07:16:00.246196 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:16:11 crc kubenswrapper[5127]: I0201 07:16:11.235856 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:16:11 crc kubenswrapper[5127]: E0201 07:16:11.238297 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:16:25 crc kubenswrapper[5127]: I0201 07:16:25.235715 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:16:25 crc kubenswrapper[5127]: E0201 07:16:25.236607 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.327351 5127 scope.go:117] "RemoveContainer" containerID="83c1670061335e3a90879e2fc206a129d30c726e80c1076a312eddd7f881625a" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.361774 5127 scope.go:117] "RemoveContainer" containerID="61e718b5841a9da16e8cc4920a23aae60828b65b752e22e97dd3965261d101ac" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.418061 5127 scope.go:117] "RemoveContainer" containerID="f98afb71f045a45fd856bbc1a5357077f7ba521a2f87d844ed202ca981f7c708" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.451647 5127 scope.go:117] "RemoveContainer" containerID="627b8c1220460e6b7ed1941227e8f20d5bd889a0bf5657636bc2ea4ab157629d" Feb 01 07:16:39 crc 
kubenswrapper[5127]: I0201 07:16:39.508889 5127 scope.go:117] "RemoveContainer" containerID="765723f0588cfac569cb1d2b34aaaf61f0d10551bffade767398b9e84a692d76" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.536505 5127 scope.go:117] "RemoveContainer" containerID="1b5a0ec763bf580aef8e9f8e8d7be9087c396c7cc6f58d1fd6b7ab61ac9d9f28" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.561308 5127 scope.go:117] "RemoveContainer" containerID="0be8d4cb9574063f87962b5663f7c99862b6167cbe906b2f8987098ff021beff" Feb 01 07:16:39 crc kubenswrapper[5127]: I0201 07:16:39.585925 5127 scope.go:117] "RemoveContainer" containerID="00489ed369f9be9fa5ae5086922bd8bea26af0611b892a20c80a5e1db18c8328" Feb 01 07:16:40 crc kubenswrapper[5127]: I0201 07:16:40.243546 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:16:40 crc kubenswrapper[5127]: E0201 07:16:40.244055 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:16:52 crc kubenswrapper[5127]: I0201 07:16:52.235786 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:16:52 crc kubenswrapper[5127]: E0201 07:16:52.236767 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:17:07 crc kubenswrapper[5127]: I0201 07:17:07.236141 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:17:07 crc kubenswrapper[5127]: E0201 07:17:07.237032 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:17:19 crc kubenswrapper[5127]: I0201 07:17:19.236234 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:17:19 crc kubenswrapper[5127]: E0201 07:17:19.237112 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:17:33 crc kubenswrapper[5127]: I0201 07:17:33.235122 5127 scope.go:117] "RemoveContainer" 
containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:17:33 crc kubenswrapper[5127]: E0201 07:17:33.235847 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:17:39 crc kubenswrapper[5127]: I0201 07:17:39.720148 5127 scope.go:117] "RemoveContainer" containerID="fdfa4791e619968d1933af4255d70dd183b5deedea313c73ca0881970b0dfbbb" Feb 01 07:17:39 crc kubenswrapper[5127]: I0201 07:17:39.786064 5127 scope.go:117] "RemoveContainer" containerID="63227caa383783fd418ece07b8dc86363707165b10731d1623e6cffade0f67b5" Feb 01 07:17:39 crc kubenswrapper[5127]: I0201 07:17:39.846900 5127 scope.go:117] "RemoveContainer" containerID="38f9be4585beef393b4454e51aeb832a07b82b8cb04999ea047e6c141c0ad22c" Feb 01 07:17:44 crc kubenswrapper[5127]: I0201 07:17:44.236213 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:17:44 crc kubenswrapper[5127]: E0201 07:17:44.238040 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:17:55 crc kubenswrapper[5127]: I0201 07:17:55.235481 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:17:55 crc kubenswrapper[5127]: E0201 07:17:55.236357 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:18:10 crc kubenswrapper[5127]: I0201 07:18:10.242982 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:18:10 crc kubenswrapper[5127]: E0201 07:18:10.244064 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:18:24 crc kubenswrapper[5127]: I0201 07:18:24.237477 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:18:24 crc kubenswrapper[5127]: E0201 07:18:24.238471 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:18:39 crc kubenswrapper[5127]: I0201 07:18:39.235290 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:18:39 crc kubenswrapper[5127]: E0201 07:18:39.236547 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:18:39 crc kubenswrapper[5127]: I0201 07:18:39.932190 5127 scope.go:117] "RemoveContainer" containerID="8cf8bfca617699b4935b3ba3f769e3af08c1e668d2cbc37132518f3e8740bc5f" Feb 01 07:18:50 crc kubenswrapper[5127]: I0201 07:18:50.243457 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:18:50 crc kubenswrapper[5127]: E0201 07:18:50.244470 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:19:01 crc kubenswrapper[5127]: I0201 07:19:01.236832 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:19:01 crc kubenswrapper[5127]: E0201 07:19:01.239145 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:19:15 crc kubenswrapper[5127]: I0201 07:19:15.236813 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:19:15 crc kubenswrapper[5127]: E0201 07:19:15.237973 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:19:27 crc kubenswrapper[5127]: I0201 07:19:27.235730 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:19:27 crc kubenswrapper[5127]: E0201 07:19:27.236645 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:19:42 crc kubenswrapper[5127]: I0201 07:19:42.236205 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:19:42 crc kubenswrapper[5127]: E0201 07:19:42.237210 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:19:53 crc kubenswrapper[5127]: I0201 07:19:53.235371 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:19:53 crc kubenswrapper[5127]: E0201 07:19:53.236442 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:20:04 crc kubenswrapper[5127]: I0201 07:20:04.236611 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:20:04 crc kubenswrapper[5127]: E0201 07:20:04.237539 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:20:19 crc kubenswrapper[5127]: I0201 07:20:19.236155 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:20:19 crc kubenswrapper[5127]: E0201 07:20:19.237235 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:20:34 crc kubenswrapper[5127]: I0201 07:20:34.238679 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:20:34 crc kubenswrapper[5127]: E0201 07:20:34.240072 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:20:45 crc kubenswrapper[5127]: I0201 07:20:45.236362 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e" Feb 01 07:20:46 crc kubenswrapper[5127]: I0201 07:20:46.429103 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"d7875e7ac16e6d276da78b874d6b8538539993b4472a69ae78ff37eb93e4dd14"} Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.555280 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:01 crc kubenswrapper[5127]: E0201 07:22:01.556128 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" containerName="collect-profiles" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.556142 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" containerName="collect-profiles" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.556297 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" containerName="collect-profiles" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.557412 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.569729 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.749842 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.750268 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.750309 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvc4b\" (UniqueName: \"kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.851911 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.851987 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.852023 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvc4b\" (UniqueName: \"kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.852618 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.852849 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.878561 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvc4b\" (UniqueName: \"kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b\") pod \"certified-operators-xdn46\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:01 crc kubenswrapper[5127]: I0201 07:22:01.893370 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:02 crc kubenswrapper[5127]: I0201 07:22:02.345811 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:03 crc kubenswrapper[5127]: I0201 07:22:03.104855 5127 generic.go:334] "Generic (PLEG): container finished" podID="db868546-cf5f-4c94-886b-80a8c421e322" containerID="32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0" exitCode=0 Feb 01 07:22:03 crc kubenswrapper[5127]: I0201 07:22:03.104959 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerDied","Data":"32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0"} Feb 01 07:22:03 crc kubenswrapper[5127]: I0201 07:22:03.105256 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerStarted","Data":"ece265e7a4c7c1e6e718449eaa86880ec7b5ffc5533ad1edd6dbf2bdfa348a8a"} Feb 01 07:22:03 crc kubenswrapper[5127]: I0201 07:22:03.109233 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:22:04 crc kubenswrapper[5127]: I0201 07:22:04.115716 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerStarted","Data":"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d"} Feb 01 07:22:05 crc kubenswrapper[5127]: I0201 07:22:05.129425 5127 generic.go:334] "Generic (PLEG): container finished" podID="db868546-cf5f-4c94-886b-80a8c421e322" containerID="0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d" exitCode=0 Feb 01 07:22:05 crc kubenswrapper[5127]: I0201 07:22:05.129521 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerDied","Data":"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d"} Feb 01 07:22:06 crc kubenswrapper[5127]: I0201 07:22:06.142320 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerStarted","Data":"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be"} Feb 01 07:22:06 crc kubenswrapper[5127]: I0201 07:22:06.180884 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdn46" podStartSLOduration=2.74296068 podStartE2EDuration="5.180854146s" podCreationTimestamp="2026-02-01 07:22:01 +0000 UTC" firstStartedPulling="2026-02-01 07:22:03.108828077 +0000 UTC m=+2073.594730480" lastFinishedPulling="2026-02-01 07:22:05.546721543 +0000 UTC m=+2076.032623946" observedRunningTime="2026-02-01 07:22:06.174365711 +0000 UTC m=+2076.660268104" watchObservedRunningTime="2026-02-01 07:22:06.180854146 +0000 UTC m=+2076.666756539" Feb 01 07:22:11 crc kubenswrapper[5127]: I0201 07:22:11.894266 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:11 crc kubenswrapper[5127]: I0201 07:22:11.895149 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdn46" 
Feb 01 07:22:11 crc kubenswrapper[5127]: I0201 07:22:11.974872 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:12 crc kubenswrapper[5127]: I0201 07:22:12.267585 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:12 crc kubenswrapper[5127]: I0201 07:22:12.340711 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.213755 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdn46" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="registry-server" containerID="cri-o://c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be" gracePeriod=2 Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.737887 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.765301 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvc4b\" (UniqueName: \"kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b\") pod \"db868546-cf5f-4c94-886b-80a8c421e322\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.765758 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content\") pod \"db868546-cf5f-4c94-886b-80a8c421e322\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.765839 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities\") pod \"db868546-cf5f-4c94-886b-80a8c421e322\" (UID: \"db868546-cf5f-4c94-886b-80a8c421e322\") " Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.772306 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities" (OuterVolumeSpecName: "utilities") pod "db868546-cf5f-4c94-886b-80a8c421e322" (UID: "db868546-cf5f-4c94-886b-80a8c421e322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.776862 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b" (OuterVolumeSpecName: "kube-api-access-lvc4b") pod "db868546-cf5f-4c94-886b-80a8c421e322" (UID: "db868546-cf5f-4c94-886b-80a8c421e322"). InnerVolumeSpecName "kube-api-access-lvc4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.847170 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db868546-cf5f-4c94-886b-80a8c421e322" (UID: "db868546-cf5f-4c94-886b-80a8c421e322"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.868702 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.868729 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db868546-cf5f-4c94-886b-80a8c421e322-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:22:14 crc kubenswrapper[5127]: I0201 07:22:14.868742 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvc4b\" (UniqueName: \"kubernetes.io/projected/db868546-cf5f-4c94-886b-80a8c421e322-kube-api-access-lvc4b\") on node \"crc\" DevicePath \"\"" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.226144 5127 generic.go:334] "Generic (PLEG): container finished" podID="db868546-cf5f-4c94-886b-80a8c421e322" containerID="c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be" exitCode=0 Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.226203 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerDied","Data":"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be"} Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.226307 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdn46" event={"ID":"db868546-cf5f-4c94-886b-80a8c421e322","Type":"ContainerDied","Data":"ece265e7a4c7c1e6e718449eaa86880ec7b5ffc5533ad1edd6dbf2bdfa348a8a"} Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.226317 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdn46" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.226403 5127 scope.go:117] "RemoveContainer" containerID="c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.251276 5127 scope.go:117] "RemoveContainer" containerID="0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.291348 5127 scope.go:117] "RemoveContainer" containerID="32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.296191 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.302956 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdn46"] Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.326228 5127 scope.go:117] "RemoveContainer" containerID="c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be" Feb 01 07:22:15 crc kubenswrapper[5127]: E0201 07:22:15.326929 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be\": container with ID starting with c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be not found: ID does not exist" containerID="c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.326998 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be"} err="failed to get container status \"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be\": rpc error: code = NotFound desc = could not find container \"c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be\": container with ID starting with c78a435ca0c4d5bbd81e946a3e9d0c7ce75d97cab23abdc42eefb994453d21be not found: ID does not exist" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.327038 5127 scope.go:117] "RemoveContainer" containerID="0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d" Feb 01 07:22:15 crc kubenswrapper[5127]: E0201 07:22:15.327503 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d\": container with ID starting with 0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d not found: ID does not exist" containerID="0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.327535 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d"} err="failed to get container status \"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d\": rpc error: code = NotFound desc = could not find container \"0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d\": container with ID starting with 0b29f582416f99098199bf8de62d85edbf9cc69f1afbeda637a6a587b6aaf64d not found: ID does not exist" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.327555 5127 scope.go:117] "RemoveContainer" 
containerID="32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0" Feb 01 07:22:15 crc kubenswrapper[5127]: E0201 07:22:15.328274 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0\": container with ID starting with 32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0 not found: ID does not exist" containerID="32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0" Feb 01 07:22:15 crc kubenswrapper[5127]: I0201 07:22:15.328324 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0"} err="failed to get container status \"32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0\": rpc error: code = NotFound desc = could not find container \"32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0\": container with ID starting with 32e126ac3c7fac00ee9f9de0b0eb7a8f99a157d0d6de44d5ff12b2a2301989c0 not found: ID does not exist" Feb 01 07:22:16 crc kubenswrapper[5127]: I0201 07:22:16.250035 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db868546-cf5f-4c94-886b-80a8c421e322" path="/var/lib/kubelet/pods/db868546-cf5f-4c94-886b-80a8c421e322/volumes" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.243515 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:22:53 crc kubenswrapper[5127]: E0201 07:22:53.244511 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="extract-utilities" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.244526 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="extract-utilities" Feb 01 07:22:53 crc kubenswrapper[5127]: E0201 07:22:53.244553 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="registry-server" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.244561 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="registry-server" Feb 01 07:22:53 crc kubenswrapper[5127]: E0201 07:22:53.244574 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="extract-content" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.244600 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="extract-content" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.244794 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="db868546-cf5f-4c94-886b-80a8c421e322" containerName="registry-server" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.246019 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.257280 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.443867 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.444004 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4slrn\" (UniqueName: \"kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.444147 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.545294 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.545428 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4slrn\" (UniqueName: \"kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.545546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.546153 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.546548 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.575192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4slrn\" (UniqueName: \"kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn\") pod \"redhat-operators-n4z6j\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:53 crc kubenswrapper[5127]: I0201 07:22:53.867925 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:22:54 crc kubenswrapper[5127]: I0201 07:22:54.303304 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:22:54 crc kubenswrapper[5127]: I0201 07:22:54.587707 5127 generic.go:334] "Generic (PLEG): container finished" podID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerID="7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10" exitCode=0 Feb 01 07:22:54 crc kubenswrapper[5127]: I0201 07:22:54.587757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerDied","Data":"7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10"} Feb 01 07:22:54 crc kubenswrapper[5127]: I0201 07:22:54.587789 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerStarted","Data":"81ad5e94bd7a59c210493d6cf23724da841684f3b697a9f94c6cd8d8ac99f88e"} Feb 01 07:22:55 crc kubenswrapper[5127]: I0201 07:22:55.599802 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerStarted","Data":"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9"} Feb 01 07:22:56 crc kubenswrapper[5127]: I0201 07:22:56.617600 5127 generic.go:334] "Generic (PLEG): container finished" podID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerID="664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9" exitCode=0 Feb 01 07:22:56 crc kubenswrapper[5127]: I0201 07:22:56.617653 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerDied","Data":"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9"} Feb 01 07:22:57 crc kubenswrapper[5127]: I0201 07:22:57.629675 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerStarted","Data":"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f"} Feb 01 07:22:57 crc kubenswrapper[5127]: I0201 07:22:57.659746 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4z6j" podStartSLOduration=2.217022681 podStartE2EDuration="4.659711595s" podCreationTimestamp="2026-02-01 07:22:53 +0000 UTC" firstStartedPulling="2026-02-01 07:22:54.589023333 +0000 UTC m=+2125.074925696" lastFinishedPulling="2026-02-01 07:22:57.031712217 +0000 UTC m=+2127.517614610" observedRunningTime="2026-02-01 07:22:57.656246681 +0000 UTC m=+2128.142149084" watchObservedRunningTime="2026-02-01 07:22:57.659711595 +0000 UTC m=+2128.145613998" Feb 01 07:23:03 crc kubenswrapper[5127]: I0201 07:23:03.869125 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 
07:23:03 crc kubenswrapper[5127]: I0201 07:23:03.869778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:23:04 crc kubenswrapper[5127]: I0201 07:23:04.959416 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n4z6j" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="registry-server" probeResult="failure" output=< Feb 01 07:23:04 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 07:23:04 crc kubenswrapper[5127]: > Feb 01 07:23:06 crc kubenswrapper[5127]: I0201 07:23:06.740661 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:23:06 crc kubenswrapper[5127]: I0201 07:23:06.741048 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:23:13 crc kubenswrapper[5127]: I0201 07:23:13.946937 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:23:14 crc kubenswrapper[5127]: I0201 07:23:14.016192 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:23:14 crc kubenswrapper[5127]: I0201 07:23:14.195533 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:23:15 crc kubenswrapper[5127]: I0201 07:23:15.798634 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4z6j" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="registry-server" containerID="cri-o://82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f" gracePeriod=2 Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.297689 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.413687 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4slrn\" (UniqueName: \"kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn\") pod \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.413873 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content\") pod \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.413927 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities\") pod \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\" (UID: \"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8\") " Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.415783 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities" (OuterVolumeSpecName: "utilities") pod "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" (UID: "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.419717 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn" (OuterVolumeSpecName: "kube-api-access-4slrn") pod "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" (UID: "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8"). InnerVolumeSpecName "kube-api-access-4slrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.517096 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.517231 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4slrn\" (UniqueName: \"kubernetes.io/projected/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-kube-api-access-4slrn\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.562732 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" (UID: "5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.618471 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.809789 5127 generic.go:334] "Generic (PLEG): container finished" podID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerID="82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f" exitCode=0 Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.809891 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerDied","Data":"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f"} Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.809933 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4z6j" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.810858 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4z6j" event={"ID":"5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8","Type":"ContainerDied","Data":"81ad5e94bd7a59c210493d6cf23724da841684f3b697a9f94c6cd8d8ac99f88e"} Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.810906 5127 scope.go:117] "RemoveContainer" containerID="82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.838549 5127 scope.go:117] "RemoveContainer" containerID="664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.874275 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.879792 5127 scope.go:117] "RemoveContainer" containerID="7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.885235 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4z6j"] Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.908493 5127 scope.go:117] "RemoveContainer" containerID="82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f" Feb 01 07:23:16 crc kubenswrapper[5127]: E0201 07:23:16.909331 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f\": container with ID starting with 82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f not found: ID does not exist" containerID="82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.909398 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f"} err="failed to get container status \"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f\": rpc error: code = NotFound desc = could not find container \"82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f\": container with ID starting with 82139e72dc5ba8c6e950785d76ba5f334e04f7fda4b12de990203cc12c94b10f not found: ID does not exist" Feb 01 07:23:16 crc 
kubenswrapper[5127]: I0201 07:23:16.909446 5127 scope.go:117] "RemoveContainer" containerID="664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9" Feb 01 07:23:16 crc kubenswrapper[5127]: E0201 07:23:16.910227 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9\": container with ID starting with 664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9 not found: ID does not exist" containerID="664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.910274 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9"} err="failed to get container status \"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9\": rpc error: code = NotFound desc = could not find container \"664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9\": container with ID starting with 664a60abef83cefc035bd7e482f875d6c4d2309532a1dcbaa6184ea3f1f7eef9 not found: ID does not exist" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.910302 5127 scope.go:117] "RemoveContainer" containerID="7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10" Feb 01 07:23:16 crc kubenswrapper[5127]: E0201 07:23:16.910935 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10\": container with ID starting with 7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10 not found: ID does not exist" containerID="7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10" Feb 01 07:23:16 crc kubenswrapper[5127]: I0201 07:23:16.910976 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10"} err="failed to get container status \"7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10\": rpc error: code = NotFound desc = could not find container \"7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10\": container with ID starting with 7d795ce3c229be92b9ff5c9a53c9287978b0e1fbd4f625192af92e62eef5db10 not found: ID does not exist" Feb 01 07:23:18 crc kubenswrapper[5127]: I0201 07:23:18.246922 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" path="/var/lib/kubelet/pods/5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8/volumes" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.308407 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:30 crc kubenswrapper[5127]: E0201 07:23:30.311338 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="extract-utilities" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.311362 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="extract-utilities" Feb 01 07:23:30 crc kubenswrapper[5127]: E0201 07:23:30.311386 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="extract-content" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.311394 5127 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="extract-content" Feb 01 07:23:30 crc kubenswrapper[5127]: E0201 07:23:30.311403 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="registry-server" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.311411 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="registry-server" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.311570 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab06a9c-4b8a-4bc4-82c7-01d4bb0353f8" containerName="registry-server" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.313076 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.325204 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.480143 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.480229 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.480309 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.582339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.582552 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.582870 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc 
kubenswrapper[5127]: I0201 07:23:30.583168 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.583417 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.614506 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd\") pod \"redhat-marketplace-wzcrt\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:30 crc kubenswrapper[5127]: I0201 07:23:30.630512 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:31 crc kubenswrapper[5127]: I0201 07:23:31.117743 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:31 crc kubenswrapper[5127]: I0201 07:23:31.953021 5127 generic.go:334] "Generic (PLEG): container finished" podID="e4b21619-2f78-478a-b513-a25e70d7f609" containerID="5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab" exitCode=0 Feb 01 07:23:31 crc kubenswrapper[5127]: I0201 07:23:31.953072 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerDied","Data":"5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab"} Feb 01 07:23:31 crc kubenswrapper[5127]: I0201 07:23:31.953354 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerStarted","Data":"607eaf99cb005b4b7fa7252ff499dd4458f87b849f0b45dfc252634e71a35b8e"} Feb 01 07:23:32 crc kubenswrapper[5127]: I0201 07:23:32.963851 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerStarted","Data":"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b"} Feb 01 07:23:33 crc kubenswrapper[5127]: E0201 07:23:33.086230 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b21619_2f78_478a_b513_a25e70d7f609.slice/crio-c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b.scope\": RecentStats: unable to find data in memory cache]" Feb 01 07:23:33 crc kubenswrapper[5127]: I0201 07:23:33.978010 5127 generic.go:334] "Generic (PLEG): container finished" podID="e4b21619-2f78-478a-b513-a25e70d7f609" containerID="c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b" exitCode=0 Feb 01 07:23:33 crc kubenswrapper[5127]: I0201 07:23:33.978199 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerDied","Data":"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b"} Feb 01 07:23:34 crc kubenswrapper[5127]: I0201 07:23:34.990030 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerStarted","Data":"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890"} Feb 01 07:23:35 crc kubenswrapper[5127]: I0201 07:23:35.025836 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wzcrt" podStartSLOduration=2.531425071 podStartE2EDuration="5.02580317s" podCreationTimestamp="2026-02-01 07:23:30 +0000 UTC" firstStartedPulling="2026-02-01 07:23:31.954870729 +0000 UTC m=+2162.440773082" lastFinishedPulling="2026-02-01 07:23:34.449248768 +0000 UTC m=+2164.935151181" observedRunningTime="2026-02-01 07:23:35.012023658 +0000 UTC m=+2165.497926061" watchObservedRunningTime="2026-02-01 07:23:35.02580317 +0000 UTC m=+2165.511705573" Feb 01 07:23:36 crc kubenswrapper[5127]: I0201 07:23:36.741195 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:23:36 crc kubenswrapper[5127]: I0201 07:23:36.741260 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:23:40 crc kubenswrapper[5127]: I0201 07:23:40.631778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:40 crc kubenswrapper[5127]: I0201 07:23:40.634800 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:40 crc kubenswrapper[5127]: I0201 07:23:40.724255 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:41 crc kubenswrapper[5127]: I0201 07:23:41.097031 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.187196 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.188004 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wzcrt" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="registry-server" containerID="cri-o://6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890" gracePeriod=2 Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.723647 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.816044 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities\") pod \"e4b21619-2f78-478a-b513-a25e70d7f609\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.816117 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content\") pod \"e4b21619-2f78-478a-b513-a25e70d7f609\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.816206 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd\") pod \"e4b21619-2f78-478a-b513-a25e70d7f609\" (UID: \"e4b21619-2f78-478a-b513-a25e70d7f609\") " Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.817170 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities" (OuterVolumeSpecName: "utilities") pod "e4b21619-2f78-478a-b513-a25e70d7f609" (UID: "e4b21619-2f78-478a-b513-a25e70d7f609"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.822770 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd" (OuterVolumeSpecName: "kube-api-access-gckmd") pod "e4b21619-2f78-478a-b513-a25e70d7f609" (UID: "e4b21619-2f78-478a-b513-a25e70d7f609"). InnerVolumeSpecName "kube-api-access-gckmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.843914 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4b21619-2f78-478a-b513-a25e70d7f609" (UID: "e4b21619-2f78-478a-b513-a25e70d7f609"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.918392 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/e4b21619-2f78-478a-b513-a25e70d7f609-kube-api-access-gckmd\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.918443 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:44 crc kubenswrapper[5127]: I0201 07:23:44.918464 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4b21619-2f78-478a-b513-a25e70d7f609-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.084202 5127 generic.go:334] "Generic (PLEG): container finished" podID="e4b21619-2f78-478a-b513-a25e70d7f609" containerID="6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890" exitCode=0 Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.084292 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerDied","Data":"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890"} Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.084389 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wzcrt" event={"ID":"e4b21619-2f78-478a-b513-a25e70d7f609","Type":"ContainerDied","Data":"607eaf99cb005b4b7fa7252ff499dd4458f87b849f0b45dfc252634e71a35b8e"} Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.084424 5127 scope.go:117] "RemoveContainer" containerID="6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.084510 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wzcrt" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.115574 5127 scope.go:117] "RemoveContainer" containerID="c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.151551 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.156109 5127 scope.go:117] "RemoveContainer" containerID="5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.165172 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wzcrt"] Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.187329 5127 scope.go:117] "RemoveContainer" containerID="6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890" Feb 01 07:23:45 crc kubenswrapper[5127]: E0201 07:23:45.188128 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890\": container with ID starting with 6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890 not found: ID does not exist" containerID="6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.188229 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890"} err="failed to get container status \"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890\": rpc error: code = NotFound desc = could not find container \"6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890\": container with ID starting with 6ee8188c798dcd6d6c9f878da43f6d1754ed286a222ec783b99fd0524e9db890 not found: ID does not exist" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.188307 5127 scope.go:117] "RemoveContainer" containerID="c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b" Feb 01 07:23:45 crc kubenswrapper[5127]: E0201 07:23:45.189182 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b\": container with ID starting with c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b not found: ID does not exist" containerID="c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.189264 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b"} err="failed to get container status \"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b\": rpc error: code = NotFound desc = could not find container \"c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b\": container with ID starting with c404c42fb468a6be407e574387f97b85df56488a78d45dd5588905ed1a21e17b not found: ID does not exist" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.189298 5127 scope.go:117] "RemoveContainer" containerID="5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab" Feb 01 07:23:45 crc kubenswrapper[5127]: E0201 07:23:45.189882 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab\": container with ID starting with 5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab not found: ID does not exist" containerID="5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab" Feb 01 07:23:45 crc kubenswrapper[5127]: I0201 07:23:45.189949 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab"} err="failed to get container status \"5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab\": rpc error: code = NotFound desc = could not find container \"5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab\": container with ID starting with 5814a04fa60aea0704b09d19842d596b9cec5e80f9e13b4a9bfaccf47063bcab not found: ID does not exist" Feb 01 07:23:46 crc kubenswrapper[5127]: I0201 07:23:46.250197 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" path="/var/lib/kubelet/pods/e4b21619-2f78-478a-b513-a25e70d7f609/volumes" Feb 01 07:24:06 crc kubenswrapper[5127]: I0201 07:24:06.741461 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:24:06 crc kubenswrapper[5127]: I0201 07:24:06.742290 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:24:06 crc kubenswrapper[5127]: I0201 07:24:06.742356 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:24:06 crc kubenswrapper[5127]: I0201 07:24:06.743652 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7875e7ac16e6d276da78b874d6b8538539993b4472a69ae78ff37eb93e4dd14"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:24:06 crc kubenswrapper[5127]: I0201 07:24:06.743753 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://d7875e7ac16e6d276da78b874d6b8538539993b4472a69ae78ff37eb93e4dd14" gracePeriod=600 Feb 01 07:24:07 crc kubenswrapper[5127]: I0201 07:24:07.326987 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="d7875e7ac16e6d276da78b874d6b8538539993b4472a69ae78ff37eb93e4dd14" exitCode=0 Feb 01 07:24:07 crc kubenswrapper[5127]: I0201 07:24:07.327082 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
Feb 01 07:24:07 crc kubenswrapper[5127]: I0201 07:24:07.327523 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"}
Feb 01 07:24:07 crc kubenswrapper[5127]: I0201 07:24:07.327568 5127 scope.go:117] "RemoveContainer" containerID="c86f6cd98df81b344ad7290fa3f862160bfd88b66f27e16acc7a1e0fc01af03e"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.360796 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tzshh"]
Feb 01 07:24:24 crc kubenswrapper[5127]: E0201 07:24:24.361719 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="extract-content"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.361735 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="extract-content"
Feb 01 07:24:24 crc kubenswrapper[5127]: E0201 07:24:24.361759 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="registry-server"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.361768 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="registry-server"
Feb 01 07:24:24 crc kubenswrapper[5127]: E0201 07:24:24.361785 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="extract-utilities"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.361793 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="extract-utilities"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.361955 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b21619-2f78-478a-b513-a25e70d7f609" containerName="registry-server"
Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.363126 5127 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.382995 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzshh"] Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.481758 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggb2\" (UniqueName: \"kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.482015 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.482037 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.583418 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggb2\" (UniqueName: \"kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.583947 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.584487 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.584777 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.584823 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.605188 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cggb2\" (UniqueName: \"kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2\") pod \"community-operators-tzshh\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") " pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:24 crc kubenswrapper[5127]: I0201 07:24:24.699854 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:25 crc kubenswrapper[5127]: I0201 07:24:25.193186 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tzshh"] Feb 01 07:24:26 crc kubenswrapper[5127]: I0201 07:24:26.222387 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerStarted","Data":"e87d502e8b561475839f99abfeffa602f02c3f52f675d46033fe2c9f3b58feb6"} Feb 01 07:24:27 crc kubenswrapper[5127]: I0201 07:24:27.235391 5127 generic.go:334] "Generic (PLEG): container finished" podID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerID="1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883" exitCode=0 Feb 01 07:24:27 crc kubenswrapper[5127]: I0201 07:24:27.235541 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerDied","Data":"1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883"} Feb 01 07:24:28 crc kubenswrapper[5127]: I0201 07:24:28.251936 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerStarted","Data":"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea"} Feb 01 07:24:29 crc kubenswrapper[5127]: I0201 07:24:29.262617 5127 generic.go:334] "Generic (PLEG): container finished" podID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerID="893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea" exitCode=0 Feb 01 07:24:29 crc kubenswrapper[5127]: I0201 07:24:29.263061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerDied","Data":"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea"} Feb 01 07:24:30 crc kubenswrapper[5127]: I0201 07:24:30.288096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerStarted","Data":"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372"} Feb 01 07:24:30 crc kubenswrapper[5127]: I0201 07:24:30.320569 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tzshh" podStartSLOduration=3.9233280390000003 podStartE2EDuration="6.320544388s" podCreationTimestamp="2026-02-01 07:24:24 +0000 UTC" firstStartedPulling="2026-02-01 07:24:27.238097906 +0000 UTC m=+2217.724000309" lastFinishedPulling="2026-02-01 07:24:29.635314255 +0000 UTC m=+2220.121216658" observedRunningTime="2026-02-01 07:24:30.317441254 +0000 UTC m=+2220.803343707" watchObservedRunningTime="2026-02-01 07:24:30.320544388 +0000 UTC m=+2220.806446791" Feb 01 07:24:34 crc kubenswrapper[5127]: I0201 07:24:34.700015 5127 kubelet.go:2542] "SyncLoop (probe)" 
Feb 01 07:24:34 crc kubenswrapper[5127]: I0201 07:24:34.700731 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tzshh"
Feb 01 07:24:34 crc kubenswrapper[5127]: I0201 07:24:34.774823 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tzshh"
Feb 01 07:24:35 crc kubenswrapper[5127]: I0201 07:24:35.414249 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tzshh"
Feb 01 07:24:35 crc kubenswrapper[5127]: I0201 07:24:35.482897 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tzshh"]
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.354214 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tzshh" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="registry-server" containerID="cri-o://cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372" gracePeriod=2
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.881105 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tzshh"
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.994392 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cggb2\" (UniqueName: \"kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2\") pod \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") "
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.994890 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities\") pod \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") "
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.994977 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content\") pod \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\" (UID: \"f29cf59d-6a2a-4cd2-81e7-2be1694b1002\") "
Feb 01 07:24:37 crc kubenswrapper[5127]: I0201 07:24:37.995960 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities" (OuterVolumeSpecName: "utilities") pod "f29cf59d-6a2a-4cd2-81e7-2be1694b1002" (UID: "f29cf59d-6a2a-4cd2-81e7-2be1694b1002"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.005892 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2" (OuterVolumeSpecName: "kube-api-access-cggb2") pod "f29cf59d-6a2a-4cd2-81e7-2be1694b1002" (UID: "f29cf59d-6a2a-4cd2-81e7-2be1694b1002"). InnerVolumeSpecName "kube-api-access-cggb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
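
The UnmountVolume / TearDown entries above, and the "Volume detached" confirmations just below, are the kubelet's volume reconciler at work after the DELETE: it compares the desired world (volumes that still-existing pods need) against the actual world (volumes currently mounted) and tears down whatever remains only in the latter, one volume at a time. A toy Go sketch of that set-difference step; the names and types here are hypothetical, not the kubelet's real ones:

    package main

    import "fmt"

    // Toy version of the reconciler's unmount decision: anything mounted in the
    // actual world but absent from the desired world gets torn down. The real
    // implementation lives behind operationExecutor and the desired/actual
    // state-of-world caches referenced in the entries above.
    func volumesToUnmount(desired, actual map[string]bool) []string {
    	var stale []string
    	for vol := range actual {
    		if !desired[vol] {
    			stale = append(stale, vol)
    		}
    	}
    	return stale
    }

    func main() {
    	actual := map[string]bool{
    		"kube-api-access-cggb2": true,
    		"utilities":             true,
    		"catalog-content":       true,
    	}
    	desired := map[string]bool{} // pod deleted: nothing is desired anymore
    	fmt.Println(volumesToUnmount(desired, actual))
    }

Map iteration order is unspecified, which is consistent with the log: each volume is unmounted independently, so the TearDown completions interleave in no fixed order.
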
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.089437 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f29cf59d-6a2a-4cd2-81e7-2be1694b1002" (UID: "f29cf59d-6a2a-4cd2-81e7-2be1694b1002"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.096861 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.096890 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cggb2\" (UniqueName: \"kubernetes.io/projected/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-kube-api-access-cggb2\") on node \"crc\" DevicePath \"\""
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.096925 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f29cf59d-6a2a-4cd2-81e7-2be1694b1002-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.372461 5127 generic.go:334] "Generic (PLEG): container finished" podID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerID="cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372" exitCode=0
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.372508 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerDied","Data":"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372"}
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.372536 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tzshh" event={"ID":"f29cf59d-6a2a-4cd2-81e7-2be1694b1002","Type":"ContainerDied","Data":"e87d502e8b561475839f99abfeffa602f02c3f52f675d46033fe2c9f3b58feb6"}
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.372553 5127 scope.go:117] "RemoveContainer" containerID="cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372"
Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.372555 5127 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-tzshh" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.405436 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tzshh"] Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.412567 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tzshh"] Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.420152 5127 scope.go:117] "RemoveContainer" containerID="893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.448905 5127 scope.go:117] "RemoveContainer" containerID="1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.471016 5127 scope.go:117] "RemoveContainer" containerID="cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372" Feb 01 07:24:38 crc kubenswrapper[5127]: E0201 07:24:38.471491 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372\": container with ID starting with cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372 not found: ID does not exist" containerID="cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.471563 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372"} err="failed to get container status \"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372\": rpc error: code = NotFound desc = could not find container \"cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372\": container with ID starting with cb0d6cd6532ea78cbe61da2eb69c2a7b8de1fe137e0c661a8d3c82134fd1f372 not found: ID does not exist" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.471630 5127 scope.go:117] "RemoveContainer" containerID="893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea" Feb 01 07:24:38 crc kubenswrapper[5127]: E0201 07:24:38.472044 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea\": container with ID starting with 893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea not found: ID does not exist" containerID="893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.472094 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea"} err="failed to get container status \"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea\": rpc error: code = NotFound desc = could not find container \"893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea\": container with ID starting with 893df32edd46b7c90b81b58c82ad1cb28bd22d6e94d71c5beebe3b1ee9ae52ea not found: ID does not exist" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.472124 5127 scope.go:117] "RemoveContainer" containerID="1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883" Feb 01 07:24:38 crc kubenswrapper[5127]: E0201 07:24:38.472730 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883\": container with ID starting with 1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883 not found: ID does not exist" containerID="1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883" Feb 01 07:24:38 crc kubenswrapper[5127]: I0201 07:24:38.472788 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883"} err="failed to get container status \"1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883\": rpc error: code = NotFound desc = could not find container \"1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883\": container with ID starting with 1e7bebd01eae04141b3144004c3f168560c7a3c4d66a4c367147d0fa136f9883 not found: ID does not exist" Feb 01 07:24:40 crc kubenswrapper[5127]: I0201 07:24:40.251125 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" path="/var/lib/kubelet/pods/f29cf59d-6a2a-4cd2-81e7-2be1694b1002/volumes" Feb 01 07:26:36 crc kubenswrapper[5127]: I0201 07:26:36.741324 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:26:36 crc kubenswrapper[5127]: I0201 07:26:36.742176 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:27:06 crc kubenswrapper[5127]: I0201 07:27:06.741266 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:27:06 crc kubenswrapper[5127]: I0201 07:27:06.741994 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:27:36 crc kubenswrapper[5127]: I0201 07:27:36.740511 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:27:36 crc kubenswrapper[5127]: I0201 07:27:36.741245 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:27:36 crc kubenswrapper[5127]: I0201 07:27:36.741316 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:27:36 crc kubenswrapper[5127]: I0201 07:27:36.742331 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:27:36 crc kubenswrapper[5127]: I0201 07:27:36.742462 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" gracePeriod=600 Feb 01 07:27:36 crc kubenswrapper[5127]: E0201 07:27:36.892968 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:27:37 crc kubenswrapper[5127]: I0201 07:27:37.084273 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" exitCode=0 Feb 01 07:27:37 crc kubenswrapper[5127]: I0201 07:27:37.084746 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"} Feb 01 07:27:37 crc kubenswrapper[5127]: I0201 07:27:37.085061 5127 scope.go:117] "RemoveContainer" containerID="d7875e7ac16e6d276da78b874d6b8538539993b4472a69ae78ff37eb93e4dd14" Feb 01 07:27:37 crc kubenswrapper[5127]: I0201 07:27:37.085665 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:27:37 crc kubenswrapper[5127]: E0201 07:27:37.086048 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:27:49 crc kubenswrapper[5127]: I0201 07:27:49.236336 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:27:49 crc kubenswrapper[5127]: E0201 07:27:49.237723 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:28:02 crc 
kubenswrapper[5127]: I0201 07:28:02.236551 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:28:02 crc kubenswrapper[5127]: E0201 07:28:02.237894 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:28:13 crc kubenswrapper[5127]: I0201 07:28:13.235970 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:28:13 crc kubenswrapper[5127]: E0201 07:28:13.238402 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:28:28 crc kubenswrapper[5127]: I0201 07:28:28.236734 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:28:28 crc kubenswrapper[5127]: E0201 07:28:28.237795 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:28:40 crc kubenswrapper[5127]: I0201 07:28:40.243920 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:28:40 crc kubenswrapper[5127]: E0201 07:28:40.244719 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:28:51 crc kubenswrapper[5127]: I0201 07:28:51.235493 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:28:51 crc kubenswrapper[5127]: E0201 07:28:51.238412 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:29:05 crc kubenswrapper[5127]: I0201 07:29:05.236150 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:29:05 crc kubenswrapper[5127]: E0201 07:29:05.237203 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
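
From here until the successful restart at 07:32:48 further down, every sync attempt for this pod is refused with the same CrashLoopBackOff message: the restart delay doubles after each failure until it reaches the cap named in the error text ("back-off 5m0s"), and while the back-off timer runs, the pod worker just logs "Error syncing pod, skipping". A short Go sketch of capped exponential back-off of this shape (the 10s initial delay is an assumption matching the kubelet's default; only the 5m cap is visible in this log):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Capped exponential back-off of the kind behind "back-off 5m0s restarting
    // failed container": the delay doubles after each failed restart until it
    // reaches the cap, after which every retry waits the full cap.
    func backoffDelay(initial, limit time.Duration, failures int) time.Duration {
    	d := initial
    	for i := 0; i < failures; i++ {
    		d *= 2
    		if d >= limit {
    			return limit
    		}
    	}
    	return d
    }

    func main() {
    	for n := 0; n <= 6; n++ {
    		fmt.Printf("after %d failures: %v\n", n, backoffDelay(10*time.Second, 5*time.Minute, n))
    	}
    }

With these parameters the delay runs 10s, 20s, 40s, 1m20s, 2m40s and then pins at 5m0s, which is why the entries below repeat the same "back-off 5m0s" text for minutes on end.
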
Feb 01 07:29:19 crc kubenswrapper[5127]: I0201 07:29:19.237011 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:29:19 crc kubenswrapper[5127]: E0201 07:29:19.238052 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:29:34 crc kubenswrapper[5127]: I0201 07:29:34.238984 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:29:34 crc kubenswrapper[5127]: E0201 07:29:34.240320 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:29:47 crc kubenswrapper[5127]: I0201 07:29:47.236024 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:29:47 crc kubenswrapper[5127]: E0201 07:29:47.237134 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:29:58 crc kubenswrapper[5127]: I0201 07:29:58.236136 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5"
Feb 01 07:29:58 crc kubenswrapper[5127]: E0201 07:29:58.237069 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.168393 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v"]
Feb 01 07:30:00 crc kubenswrapper[5127]: E0201 07:30:00.169245 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002"
containerName="registry-server" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.169270 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="registry-server" Feb 01 07:30:00 crc kubenswrapper[5127]: E0201 07:30:00.169303 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="extract-content" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.169317 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="extract-content" Feb 01 07:30:00 crc kubenswrapper[5127]: E0201 07:30:00.169348 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="extract-utilities" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.169363 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="extract-utilities" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.169681 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29cf59d-6a2a-4cd2-81e7-2be1694b1002" containerName="registry-server" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.170835 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.174684 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.184669 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.199807 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v"] Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.256248 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.256312 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.256367 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzpv\" (UniqueName: \"kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.357777 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzpv\" (UniqueName: 
\"kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.358092 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.358183 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.361025 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.372151 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.376568 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzpv\" (UniqueName: \"kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv\") pod \"collect-profiles-29498850-scw7v\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.501221 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:00 crc kubenswrapper[5127]: I0201 07:30:00.969672 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v"] Feb 01 07:30:00 crc kubenswrapper[5127]: W0201 07:30:00.977479 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7dee2c_f497_4ddf_89ef_19b5f937965b.slice/crio-762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9 WatchSource:0}: Error finding container 762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9: Status 404 returned error can't find the container with id 762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9 Feb 01 07:30:01 crc kubenswrapper[5127]: I0201 07:30:01.463545 5127 generic.go:334] "Generic (PLEG): container finished" podID="7d7dee2c-f497-4ddf-89ef-19b5f937965b" containerID="c66b1c6cc7ad86059ba5529ee83a63ed8099ff594661045d34942aa7997b3242" exitCode=0 Feb 01 07:30:01 crc kubenswrapper[5127]: I0201 07:30:01.463799 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" event={"ID":"7d7dee2c-f497-4ddf-89ef-19b5f937965b","Type":"ContainerDied","Data":"c66b1c6cc7ad86059ba5529ee83a63ed8099ff594661045d34942aa7997b3242"} Feb 01 07:30:01 crc kubenswrapper[5127]: I0201 07:30:01.463900 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" event={"ID":"7d7dee2c-f497-4ddf-89ef-19b5f937965b","Type":"ContainerStarted","Data":"762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9"} Feb 01 07:30:01 crc kubenswrapper[5127]: E0201 07:30:01.512859 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7dee2c_f497_4ddf_89ef_19b5f937965b.slice/crio-conmon-c66b1c6cc7ad86059ba5529ee83a63ed8099ff594661045d34942aa7997b3242.scope\": RecentStats: unable to find data in memory cache]" Feb 01 07:30:02 crc kubenswrapper[5127]: I0201 07:30:02.821053 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.000146 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxzpv\" (UniqueName: \"kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv\") pod \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.000301 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume\") pod \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.000432 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume\") pod \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\" (UID: \"7d7dee2c-f497-4ddf-89ef-19b5f937965b\") " Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.001732 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume" (OuterVolumeSpecName: "config-volume") pod "7d7dee2c-f497-4ddf-89ef-19b5f937965b" (UID: "7d7dee2c-f497-4ddf-89ef-19b5f937965b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.006785 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv" (OuterVolumeSpecName: "kube-api-access-wxzpv") pod "7d7dee2c-f497-4ddf-89ef-19b5f937965b" (UID: "7d7dee2c-f497-4ddf-89ef-19b5f937965b"). InnerVolumeSpecName "kube-api-access-wxzpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.007411 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7d7dee2c-f497-4ddf-89ef-19b5f937965b" (UID: "7d7dee2c-f497-4ddf-89ef-19b5f937965b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.102288 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d7dee2c-f497-4ddf-89ef-19b5f937965b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.102331 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxzpv\" (UniqueName: \"kubernetes.io/projected/7d7dee2c-f497-4ddf-89ef-19b5f937965b-kube-api-access-wxzpv\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.102346 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7d7dee2c-f497-4ddf-89ef-19b5f937965b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.484829 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" event={"ID":"7d7dee2c-f497-4ddf-89ef-19b5f937965b","Type":"ContainerDied","Data":"762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9"} Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.485182 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="762b691a922ffc8155708896024e87448e39e656b5e5f59af6112988a34328d9" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.484947 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v" Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.916676 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"] Feb 01 07:30:03 crc kubenswrapper[5127]: I0201 07:30:03.920959 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498805-m62nb"] Feb 01 07:30:04 crc kubenswrapper[5127]: I0201 07:30:04.255753 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c0fb18-dc93-4aed-abd0-55631d324b99" path="/var/lib/kubelet/pods/b6c0fb18-dc93-4aed-abd0-55631d324b99/volumes" Feb 01 07:30:13 crc kubenswrapper[5127]: I0201 07:30:13.236276 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:30:13 crc kubenswrapper[5127]: E0201 07:30:13.237510 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:30:26 crc kubenswrapper[5127]: I0201 07:30:26.235958 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:30:26 crc kubenswrapper[5127]: E0201 07:30:26.237090 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:30:40 crc kubenswrapper[5127]: I0201 07:30:40.245766 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:30:40 crc kubenswrapper[5127]: E0201 07:30:40.247273 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:30:40 crc kubenswrapper[5127]: I0201 07:30:40.281005 5127 scope.go:117] "RemoveContainer" containerID="a428cffef45a764e039c470bf17d5dd774730f225654e331579ef58a51de2f36" Feb 01 07:30:52 crc kubenswrapper[5127]: I0201 07:30:52.236267 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:30:52 crc kubenswrapper[5127]: E0201 07:30:52.237155 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:31:05 crc kubenswrapper[5127]: I0201 07:31:05.235195 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:31:05 crc kubenswrapper[5127]: E0201 07:31:05.235903 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:31:19 crc kubenswrapper[5127]: I0201 07:31:19.235528 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:31:19 crc kubenswrapper[5127]: E0201 07:31:19.236909 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:31:34 crc kubenswrapper[5127]: I0201 07:31:34.236677 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:31:34 crc kubenswrapper[5127]: E0201 07:31:34.237979 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:31:47 crc kubenswrapper[5127]: I0201 07:31:47.236094 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:31:47 crc kubenswrapper[5127]: E0201 07:31:47.237117 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:31:58 crc kubenswrapper[5127]: I0201 07:31:58.235730 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:31:58 crc kubenswrapper[5127]: E0201 07:31:58.236810 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:32:09 crc kubenswrapper[5127]: I0201 07:32:09.235289 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:32:09 crc kubenswrapper[5127]: E0201 07:32:09.237048 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:32:22 crc kubenswrapper[5127]: I0201 07:32:22.235868 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:32:22 crc kubenswrapper[5127]: E0201 07:32:22.236710 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:32:36 crc kubenswrapper[5127]: I0201 07:32:36.236316 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:32:36 crc kubenswrapper[5127]: E0201 07:32:36.237499 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:32:48 crc kubenswrapper[5127]: I0201 07:32:48.235478 5127 
scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:32:48 crc kubenswrapper[5127]: I0201 07:32:48.965814 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062"} Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.697603 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:12 crc kubenswrapper[5127]: E0201 07:33:12.698403 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7dee2c-f497-4ddf-89ef-19b5f937965b" containerName="collect-profiles" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.698415 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7dee2c-f497-4ddf-89ef-19b5f937965b" containerName="collect-profiles" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.698550 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7dee2c-f497-4ddf-89ef-19b5f937965b" containerName="collect-profiles" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.699436 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.730195 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.765255 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.765300 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttkw\" (UniqueName: \"kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.765344 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.866640 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.866772 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " 
pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.866802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttkw\" (UniqueName: \"kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.867692 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.867771 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:12 crc kubenswrapper[5127]: I0201 07:33:12.892113 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttkw\" (UniqueName: \"kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw\") pod \"redhat-operators-r47ms\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:13 crc kubenswrapper[5127]: I0201 07:33:13.020400 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:13 crc kubenswrapper[5127]: I0201 07:33:13.476221 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:14 crc kubenswrapper[5127]: I0201 07:33:14.209611 5127 generic.go:334] "Generic (PLEG): container finished" podID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerID="beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c" exitCode=0 Feb 01 07:33:14 crc kubenswrapper[5127]: I0201 07:33:14.209666 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerDied","Data":"beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c"} Feb 01 07:33:14 crc kubenswrapper[5127]: I0201 07:33:14.209929 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerStarted","Data":"188b60d108eab7d0751dc301e0102f85a44d6068acaedf721c9b5e691d6670ed"} Feb 01 07:33:14 crc kubenswrapper[5127]: I0201 07:33:14.211118 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:33:15 crc kubenswrapper[5127]: I0201 07:33:15.219429 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerStarted","Data":"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e"} Feb 01 07:33:16 crc kubenswrapper[5127]: I0201 07:33:16.230286 5127 generic.go:334] "Generic (PLEG): container finished" podID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" 
containerID="2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e" exitCode=0 Feb 01 07:33:16 crc kubenswrapper[5127]: I0201 07:33:16.231421 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerDied","Data":"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e"} Feb 01 07:33:17 crc kubenswrapper[5127]: I0201 07:33:17.242914 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerStarted","Data":"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f"} Feb 01 07:33:17 crc kubenswrapper[5127]: I0201 07:33:17.276901 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r47ms" podStartSLOduration=2.832077124 podStartE2EDuration="5.276878392s" podCreationTimestamp="2026-02-01 07:33:12 +0000 UTC" firstStartedPulling="2026-02-01 07:33:14.210897902 +0000 UTC m=+2744.696800265" lastFinishedPulling="2026-02-01 07:33:16.65569914 +0000 UTC m=+2747.141601533" observedRunningTime="2026-02-01 07:33:17.275195076 +0000 UTC m=+2747.761097469" watchObservedRunningTime="2026-02-01 07:33:17.276878392 +0000 UTC m=+2747.762780765" Feb 01 07:33:23 crc kubenswrapper[5127]: I0201 07:33:23.021090 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:23 crc kubenswrapper[5127]: I0201 07:33:23.021847 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:24 crc kubenswrapper[5127]: I0201 07:33:24.081682 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r47ms" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="registry-server" probeResult="failure" output=< Feb 01 07:33:24 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 07:33:24 crc kubenswrapper[5127]: > Feb 01 07:33:33 crc kubenswrapper[5127]: I0201 07:33:33.102090 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:33 crc kubenswrapper[5127]: I0201 07:33:33.185268 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:33 crc kubenswrapper[5127]: I0201 07:33:33.351978 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:34 crc kubenswrapper[5127]: I0201 07:33:34.401195 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r47ms" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="registry-server" containerID="cri-o://ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f" gracePeriod=2 Feb 01 07:33:34 crc kubenswrapper[5127]: I0201 07:33:34.892605 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:34 crc kubenswrapper[5127]: I0201 07:33:34.985413 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ttkw\" (UniqueName: \"kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw\") pod \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " Feb 01 07:33:34 crc kubenswrapper[5127]: I0201 07:33:34.991772 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw" (OuterVolumeSpecName: "kube-api-access-2ttkw") pod "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" (UID: "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e"). InnerVolumeSpecName "kube-api-access-2ttkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.086379 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities\") pod \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.086705 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content\") pod \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\" (UID: \"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e\") " Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.087116 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ttkw\" (UniqueName: \"kubernetes.io/projected/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-kube-api-access-2ttkw\") on node \"crc\" DevicePath \"\"" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.088066 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities" (OuterVolumeSpecName: "utilities") pod "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" (UID: "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.188613 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.271495 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" (UID: "dea43d6c-484a-4bd0-a4ee-4eec1b307b4e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.290561 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.415367 5127 generic.go:334] "Generic (PLEG): container finished" podID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerID="ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f" exitCode=0 Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.415429 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerDied","Data":"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f"} Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.415480 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r47ms" event={"ID":"dea43d6c-484a-4bd0-a4ee-4eec1b307b4e","Type":"ContainerDied","Data":"188b60d108eab7d0751dc301e0102f85a44d6068acaedf721c9b5e691d6670ed"} Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.415520 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r47ms" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.415522 5127 scope.go:117] "RemoveContainer" containerID="ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.451397 5127 scope.go:117] "RemoveContainer" containerID="2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.467780 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.475060 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r47ms"] Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.489023 5127 scope.go:117] "RemoveContainer" containerID="beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.538245 5127 scope.go:117] "RemoveContainer" containerID="ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f" Feb 01 07:33:35 crc kubenswrapper[5127]: E0201 07:33:35.540194 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f\": container with ID starting with ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f not found: ID does not exist" containerID="ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.540249 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f"} err="failed to get container status \"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f\": rpc error: code = NotFound desc = could not find container \"ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f\": container with ID starting with ff6bb2e63f7711f22246d1228f560d0c3f00bbdb5fd48c4cba8cd5c10cf2627f not found: ID does not exist" Feb 01 07:33:35 crc 
kubenswrapper[5127]: I0201 07:33:35.540283 5127 scope.go:117] "RemoveContainer" containerID="2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e" Feb 01 07:33:35 crc kubenswrapper[5127]: E0201 07:33:35.540694 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e\": container with ID starting with 2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e not found: ID does not exist" containerID="2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.540734 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e"} err="failed to get container status \"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e\": rpc error: code = NotFound desc = could not find container \"2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e\": container with ID starting with 2522b41a62d57d48eee82d068c870f1258fd412025aba22121facabdf943607e not found: ID does not exist" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.540759 5127 scope.go:117] "RemoveContainer" containerID="beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c" Feb 01 07:33:35 crc kubenswrapper[5127]: E0201 07:33:35.541099 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c\": container with ID starting with beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c not found: ID does not exist" containerID="beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c" Feb 01 07:33:35 crc kubenswrapper[5127]: I0201 07:33:35.541137 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c"} err="failed to get container status \"beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c\": rpc error: code = NotFound desc = could not find container \"beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c\": container with ID starting with beba3dbfed6375b4d3eeb2a945456971c230b4df4eda06aaaa8a5018c72e007c not found: ID does not exist" Feb 01 07:33:36 crc kubenswrapper[5127]: I0201 07:33:36.248328 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" path="/var/lib/kubelet/pods/dea43d6c-484a-4bd0-a4ee-4eec1b307b4e/volumes" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.360978 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:07 crc kubenswrapper[5127]: E0201 07:34:07.362368 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="extract-utilities" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.362396 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="extract-utilities" Feb 01 07:34:07 crc kubenswrapper[5127]: E0201 07:34:07.362421 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="registry-server" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.362436 5127 
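The RemoveStaleState lines above spell out the catalog pod's three-container lifecycle: the extract-utilities and extract-content init containers run to completion, then registry-server serves the catalog. One way to recover that sequence from journal lines like these is to scrape the PLEG events; a small sketch, with the regex tuned to this log's `event={...}` format:

```python
# Pull (event type, short container/sandbox id) pairs for one pod out of
# journal lines shaped like the kubenswrapper records above.
import re

EVENT = re.compile(r'"Type":"(Container\w+)","Data":"([0-9a-f]{64})"')

def pleg_events(lines, pod: str):
    for line in lines:
        if f'pod="{pod}"' in line:
            for typ, cid in EVENT.findall(line):
                yield typ, cid[:12]
```

Run over this section for openshift-marketplace/redhat-operators-r47ms, it yields Died/Started pairs for the two extract steps followed by a Started for the registry-server container.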
state_mem.go:107] "Deleted CPUSet assignment" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="registry-server" Feb 01 07:34:07 crc kubenswrapper[5127]: E0201 07:34:07.362476 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="extract-content" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.362495 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="extract-content" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.362958 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea43d6c-484a-4bd0-a4ee-4eec1b307b4e" containerName="registry-server" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.365119 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.384366 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.527261 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525zh\" (UniqueName: \"kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.527749 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.527798 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.629614 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.629741 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.629788 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-525zh\" (UniqueName: \"kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc 
kubenswrapper[5127]: I0201 07:34:07.631034 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.631132 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.663181 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-525zh\" (UniqueName: \"kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh\") pod \"redhat-marketplace-rmr5b\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:07 crc kubenswrapper[5127]: I0201 07:34:07.710267 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:08 crc kubenswrapper[5127]: I0201 07:34:08.246545 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:08 crc kubenswrapper[5127]: I0201 07:34:08.724117 5127 generic.go:334] "Generic (PLEG): container finished" podID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerID="99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b" exitCode=0 Feb 01 07:34:08 crc kubenswrapper[5127]: I0201 07:34:08.724197 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerDied","Data":"99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b"} Feb 01 07:34:08 crc kubenswrapper[5127]: I0201 07:34:08.725493 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerStarted","Data":"9d9bef6a73156cd1173aa11cd03f14ec92381d88693b6bd323c11baeba1af96d"} Feb 01 07:34:09 crc kubenswrapper[5127]: I0201 07:34:09.735979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerStarted","Data":"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b"} Feb 01 07:34:10 crc kubenswrapper[5127]: I0201 07:34:10.747466 5127 generic.go:334] "Generic (PLEG): container finished" podID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerID="785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b" exitCode=0 Feb 01 07:34:10 crc kubenswrapper[5127]: I0201 07:34:10.747539 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerDied","Data":"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b"} Feb 01 07:34:11 crc kubenswrapper[5127]: I0201 07:34:11.767381 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" 
event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerStarted","Data":"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1"} Feb 01 07:34:11 crc kubenswrapper[5127]: I0201 07:34:11.796970 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rmr5b" podStartSLOduration=2.356048538 podStartE2EDuration="4.796944141s" podCreationTimestamp="2026-02-01 07:34:07 +0000 UTC" firstStartedPulling="2026-02-01 07:34:08.726747578 +0000 UTC m=+2799.212649951" lastFinishedPulling="2026-02-01 07:34:11.167643151 +0000 UTC m=+2801.653545554" observedRunningTime="2026-02-01 07:34:11.788891185 +0000 UTC m=+2802.274793588" watchObservedRunningTime="2026-02-01 07:34:11.796944141 +0000 UTC m=+2802.282846544" Feb 01 07:34:17 crc kubenswrapper[5127]: I0201 07:34:17.710781 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:17 crc kubenswrapper[5127]: I0201 07:34:17.711471 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:17 crc kubenswrapper[5127]: I0201 07:34:17.773243 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:17 crc kubenswrapper[5127]: I0201 07:34:17.894659 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:18 crc kubenswrapper[5127]: I0201 07:34:18.018245 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:19 crc kubenswrapper[5127]: I0201 07:34:19.844195 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rmr5b" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="registry-server" containerID="cri-o://49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1" gracePeriod=2 Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.363092 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.459535 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities\") pod \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.459600 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-525zh\" (UniqueName: \"kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh\") pod \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.460503 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities" (OuterVolumeSpecName: "utilities") pod "dffabeee-1a3b-4199-8a94-fb6bc240b4d7" (UID: "dffabeee-1a3b-4199-8a94-fb6bc240b4d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.468799 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh" (OuterVolumeSpecName: "kube-api-access-525zh") pod "dffabeee-1a3b-4199-8a94-fb6bc240b4d7" (UID: "dffabeee-1a3b-4199-8a94-fb6bc240b4d7"). InnerVolumeSpecName "kube-api-access-525zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.561383 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content\") pod \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\" (UID: \"dffabeee-1a3b-4199-8a94-fb6bc240b4d7\") " Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.561980 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.562017 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-525zh\" (UniqueName: \"kubernetes.io/projected/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-kube-api-access-525zh\") on node \"crc\" DevicePath \"\"" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.619900 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dffabeee-1a3b-4199-8a94-fb6bc240b4d7" (UID: "dffabeee-1a3b-4199-8a94-fb6bc240b4d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.662946 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffabeee-1a3b-4199-8a94-fb6bc240b4d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.857810 5127 generic.go:334] "Generic (PLEG): container finished" podID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerID="49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1" exitCode=0 Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.857879 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerDied","Data":"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1"} Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.857957 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmr5b" event={"ID":"dffabeee-1a3b-4199-8a94-fb6bc240b4d7","Type":"ContainerDied","Data":"9d9bef6a73156cd1173aa11cd03f14ec92381d88693b6bd323c11baeba1af96d"} Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.857992 5127 scope.go:117] "RemoveContainer" containerID="49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.857903 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmr5b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.895878 5127 scope.go:117] "RemoveContainer" containerID="785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.921477 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.930611 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmr5b"] Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.940785 5127 scope.go:117] "RemoveContainer" containerID="99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.974153 5127 scope.go:117] "RemoveContainer" containerID="49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1" Feb 01 07:34:20 crc kubenswrapper[5127]: E0201 07:34:20.974767 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1\": container with ID starting with 49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1 not found: ID does not exist" containerID="49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.974824 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1"} err="failed to get container status \"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1\": rpc error: code = NotFound desc = could not find container \"49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1\": container with ID starting with 49594c1ddb44417eddbde441f1dbb98fff3444e850fd46b228043d716467beb1 not found: ID does not exist" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.974861 5127 scope.go:117] "RemoveContainer" containerID="785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b" Feb 01 07:34:20 crc kubenswrapper[5127]: E0201 07:34:20.975518 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b\": container with ID starting with 785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b not found: ID does not exist" containerID="785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.975639 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b"} err="failed to get container status \"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b\": rpc error: code = NotFound desc = could not find container \"785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b\": container with ID starting with 785c32497d07b20b41636081303014aede6f431596f9831c3d5b37eeed1d9b5b not found: ID does not exist" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.975690 5127 scope.go:117] "RemoveContainer" containerID="99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b" Feb 01 07:34:20 crc kubenswrapper[5127]: E0201 07:34:20.976383 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b\": container with ID starting with 99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b not found: ID does not exist" containerID="99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b" Feb 01 07:34:20 crc kubenswrapper[5127]: I0201 07:34:20.976452 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b"} err="failed to get container status \"99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b\": rpc error: code = NotFound desc = could not find container \"99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b\": container with ID starting with 99b920f036c2ba5eed7fdd106fd12bbf09716bed2fbd2fabea7a26782758d05b not found: ID does not exist" Feb 01 07:34:22 crc kubenswrapper[5127]: I0201 07:34:22.251275 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" path="/var/lib/kubelet/pods/dffabeee-1a3b-4199-8a94-fb6bc240b4d7/volumes" Feb 01 07:35:06 crc kubenswrapper[5127]: I0201 07:35:06.740438 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:35:06 crc kubenswrapper[5127]: I0201 07:35:06.741044 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.429817 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-khrvl"] Feb 01 07:35:25 crc kubenswrapper[5127]: E0201 07:35:25.431072 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="extract-content" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.431091 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="extract-content" Feb 01 07:35:25 crc kubenswrapper[5127]: E0201 07:35:25.431114 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="registry-server" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.431124 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="registry-server" Feb 01 07:35:25 crc kubenswrapper[5127]: E0201 07:35:25.431147 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="extract-utilities" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.431157 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="extract-utilities" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.432207 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffabeee-1a3b-4199-8a94-fb6bc240b4d7" containerName="registry-server" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 
07:35:25.437417 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.442501 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-catalog-content\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.442625 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrwv\" (UniqueName: \"kubernetes.io/projected/884c63f6-518e-4e8d-9321-0e4ba5668310-kube-api-access-rkrwv\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.442676 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-utilities\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.453068 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khrvl"] Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.544059 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-catalog-content\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.544310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrwv\" (UniqueName: \"kubernetes.io/projected/884c63f6-518e-4e8d-9321-0e4ba5668310-kube-api-access-rkrwv\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.544339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-utilities\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.544829 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-utilities\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.544878 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/884c63f6-518e-4e8d-9321-0e4ba5668310-catalog-content\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc 
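Every marketplace catalog pod in this log mounts the same three volumes: two emptyDir scratch areas (utilities and catalog-content) and a projected kube-api-access-* service-account token volume. Reconstructed from the mount records above as a Python literal; field names follow the pod spec, and the projected sources are an assumption, since the log only shows the plugin type:

```python
# Volume set inferred from the reconciler/mount records above; this is a
# reconstruction for clarity, not the pod's actual manifest.
volumes = [
    {"name": "utilities",       "emptyDir": {}},
    {"name": "catalog-content", "emptyDir": {}},
    {"name": "kube-api-access-rkrwv", "projected": {
        # kube-api-access-* volumes usually project the SA token plus
        # kube-root-ca.crt and the namespace; contents assumed here.
        "sources": [{"serviceAccountToken": {"path": "token"}}],
    }},
]
```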
kubenswrapper[5127]: I0201 07:35:25.587652 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrwv\" (UniqueName: \"kubernetes.io/projected/884c63f6-518e-4e8d-9321-0e4ba5668310-kube-api-access-rkrwv\") pod \"community-operators-khrvl\" (UID: \"884c63f6-518e-4e8d-9321-0e4ba5668310\") " pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:25 crc kubenswrapper[5127]: I0201 07:35:25.780401 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:26 crc kubenswrapper[5127]: I0201 07:35:26.282551 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khrvl"] Feb 01 07:35:26 crc kubenswrapper[5127]: I0201 07:35:26.554133 5127 generic.go:334] "Generic (PLEG): container finished" podID="884c63f6-518e-4e8d-9321-0e4ba5668310" containerID="ea35ff2d563759cac70c9d9ecd215e4beddf54c3738573ce5b77607923170282" exitCode=0 Feb 01 07:35:26 crc kubenswrapper[5127]: I0201 07:35:26.554182 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khrvl" event={"ID":"884c63f6-518e-4e8d-9321-0e4ba5668310","Type":"ContainerDied","Data":"ea35ff2d563759cac70c9d9ecd215e4beddf54c3738573ce5b77607923170282"} Feb 01 07:35:26 crc kubenswrapper[5127]: I0201 07:35:26.554229 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khrvl" event={"ID":"884c63f6-518e-4e8d-9321-0e4ba5668310","Type":"ContainerStarted","Data":"f1c327af706075ac9244a769575d43adcefd1f22bfb56accc25932c816d93450"} Feb 01 07:35:31 crc kubenswrapper[5127]: I0201 07:35:31.715012 5127 generic.go:334] "Generic (PLEG): container finished" podID="884c63f6-518e-4e8d-9321-0e4ba5668310" containerID="78c0942868c0b124a7d91a47b083ea9f0cd0329532696ac8acc6d2661e9cfdea" exitCode=0 Feb 01 07:35:31 crc kubenswrapper[5127]: I0201 07:35:31.715222 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khrvl" event={"ID":"884c63f6-518e-4e8d-9321-0e4ba5668310","Type":"ContainerDied","Data":"78c0942868c0b124a7d91a47b083ea9f0cd0329532696ac8acc6d2661e9cfdea"} Feb 01 07:35:32 crc kubenswrapper[5127]: I0201 07:35:32.725938 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-khrvl" event={"ID":"884c63f6-518e-4e8d-9321-0e4ba5668310","Type":"ContainerStarted","Data":"724d6e8f4d20c0901a641770306979c78cbdd0ad7263cf67ef7a2a79d5321533"} Feb 01 07:35:32 crc kubenswrapper[5127]: I0201 07:35:32.753020 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-khrvl" podStartSLOduration=2.172609639 podStartE2EDuration="7.753002853s" podCreationTimestamp="2026-02-01 07:35:25 +0000 UTC" firstStartedPulling="2026-02-01 07:35:26.556265888 +0000 UTC m=+2877.042168251" lastFinishedPulling="2026-02-01 07:35:32.136659102 +0000 UTC m=+2882.622561465" observedRunningTime="2026-02-01 07:35:32.747976817 +0000 UTC m=+2883.233879190" watchObservedRunningTime="2026-02-01 07:35:32.753002853 +0000 UTC m=+2883.238905226" Feb 01 07:35:35 crc kubenswrapper[5127]: I0201 07:35:35.781915 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:35 crc kubenswrapper[5127]: I0201 07:35:35.782523 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:35 crc kubenswrapper[5127]: I0201 07:35:35.874932 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:36 crc kubenswrapper[5127]: I0201 07:35:36.740832 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:35:36 crc kubenswrapper[5127]: I0201 07:35:36.740933 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:35:45 crc kubenswrapper[5127]: I0201 07:35:45.820782 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-khrvl" Feb 01 07:35:45 crc kubenswrapper[5127]: I0201 07:35:45.882115 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-khrvl"] Feb 01 07:35:45 crc kubenswrapper[5127]: I0201 07:35:45.912399 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 07:35:45 crc kubenswrapper[5127]: I0201 07:35:45.912637 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5jszf" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="registry-server" containerID="cri-o://d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774" gracePeriod=2 Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.353756 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5jszf" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.473797 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content\") pod \"7d430573-203e-43ff-abc3-a9e81827c1d6\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.473863 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxt6r\" (UniqueName: \"kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r\") pod \"7d430573-203e-43ff-abc3-a9e81827c1d6\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.473903 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities\") pod \"7d430573-203e-43ff-abc3-a9e81827c1d6\" (UID: \"7d430573-203e-43ff-abc3-a9e81827c1d6\") " Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.474782 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities" (OuterVolumeSpecName: "utilities") pod "7d430573-203e-43ff-abc3-a9e81827c1d6" (UID: "7d430573-203e-43ff-abc3-a9e81827c1d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.489122 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r" (OuterVolumeSpecName: "kube-api-access-wxt6r") pod "7d430573-203e-43ff-abc3-a9e81827c1d6" (UID: "7d430573-203e-43ff-abc3-a9e81827c1d6"). InnerVolumeSpecName "kube-api-access-wxt6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.536652 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d430573-203e-43ff-abc3-a9e81827c1d6" (UID: "7d430573-203e-43ff-abc3-a9e81827c1d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.575659 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.575692 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d430573-203e-43ff-abc3-a9e81827c1d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.575707 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxt6r\" (UniqueName: \"kubernetes.io/projected/7d430573-203e-43ff-abc3-a9e81827c1d6-kube-api-access-wxt6r\") on node \"crc\" DevicePath \"\"" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.849047 5127 generic.go:334] "Generic (PLEG): container finished" podID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerID="d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774" exitCode=0 Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.849112 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5jszf" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.849121 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerDied","Data":"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774"} Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.849172 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5jszf" event={"ID":"7d430573-203e-43ff-abc3-a9e81827c1d6","Type":"ContainerDied","Data":"bf938a70a30ae5a1c8dc9a49eddaf047eaa7a83843a3f01e33fe42139e43cbb1"} Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.849202 5127 scope.go:117] "RemoveContainer" containerID="d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.875137 5127 scope.go:117] "RemoveContainer" containerID="d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.890729 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.895268 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5jszf"] Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.908037 5127 scope.go:117] "RemoveContainer" containerID="cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.935764 5127 scope.go:117] "RemoveContainer" containerID="d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774" Feb 01 07:35:46 crc kubenswrapper[5127]: E0201 07:35:46.937384 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774\": container with ID starting with d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774 not found: ID does not exist" containerID="d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.937422 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774"} err="failed to get container status \"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774\": rpc error: code = NotFound desc = could not find container \"d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774\": container with ID starting with d585dc31180abb81b8afd81d80c7e4f3f815fcee97f3bf34c88e7951d594d774 not found: ID does not exist" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.937443 5127 scope.go:117] "RemoveContainer" containerID="d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954" Feb 01 07:35:46 crc kubenswrapper[5127]: E0201 07:35:46.937936 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954\": container with ID starting with d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954 not found: ID does not exist" containerID="d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.937983 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954"} err="failed to get container status \"d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954\": rpc error: code = NotFound desc = could not find container \"d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954\": container with ID starting with d56aa0a5323dda399c521937b5e3611898d0360a0763471ab233d3cfc4fd3954 not found: ID does not exist" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.938027 5127 scope.go:117] "RemoveContainer" containerID="cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce" Feb 01 07:35:46 crc kubenswrapper[5127]: E0201 07:35:46.938332 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce\": container with ID starting with cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce not found: ID does not exist" containerID="cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce" Feb 01 07:35:46 crc kubenswrapper[5127]: I0201 07:35:46.938362 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce"} err="failed to get container status \"cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce\": rpc error: code = NotFound desc = could not find container \"cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce\": container with ID starting with cfe1ad4f3f1d262153d6c07bf4d2e6379f48c35e957f46667ac51ab3d58424ce not found: ID does not exist" Feb 01 07:35:48 crc kubenswrapper[5127]: I0201 07:35:48.250359 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" path="/var/lib/kubelet/pods/7d430573-203e-43ff-abc3-a9e81827c1d6/volumes" Feb 01 07:36:06 crc kubenswrapper[5127]: I0201 07:36:06.741080 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:36:06 crc kubenswrapper[5127]: I0201 07:36:06.741875 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:36:06 crc kubenswrapper[5127]: I0201 07:36:06.741939 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:36:06 crc kubenswrapper[5127]: I0201 07:36:06.742706 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:36:06 crc kubenswrapper[5127]: I0201 07:36:06.742838 5127 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062" gracePeriod=600 Feb 01 07:36:07 crc kubenswrapper[5127]: I0201 07:36:07.088030 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062" exitCode=0 Feb 01 07:36:07 crc kubenswrapper[5127]: I0201 07:36:07.088148 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062"} Feb 01 07:36:07 crc kubenswrapper[5127]: I0201 07:36:07.088328 5127 scope.go:117] "RemoveContainer" containerID="e28f4bc7d1bf5e5f7e6f65652c3f35ce73255da894c3e8a35833a0583fb72fe5" Feb 01 07:36:08 crc kubenswrapper[5127]: I0201 07:36:08.103764 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"} Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.141070 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:32 crc kubenswrapper[5127]: E0201 07:38:32.142365 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="extract-content" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.142393 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="extract-content" Feb 01 07:38:32 crc kubenswrapper[5127]: E0201 07:38:32.142417 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="registry-server" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.142430 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="registry-server" Feb 01 07:38:32 crc kubenswrapper[5127]: E0201 07:38:32.142465 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="extract-utilities" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.142478 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="extract-utilities" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.142782 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d430573-203e-43ff-abc3-a9e81827c1d6" containerName="registry-server" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.144673 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.154495 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.317441 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.317683 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q5j\" (UniqueName: \"kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.317823 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.418976 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.419288 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q5j\" (UniqueName: \"kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.419339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.419571 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.420356 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.439336 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c5q5j\" (UniqueName: \"kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j\") pod \"certified-operators-lbf7n\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.490800 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:32 crc kubenswrapper[5127]: I0201 07:38:32.950802 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:33 crc kubenswrapper[5127]: I0201 07:38:33.611462 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerID="9058ba129c8ef5281eae4b67c86bf735cbb3cbfd75cdfbc6b6c4858092e8a735" exitCode=0 Feb 01 07:38:33 crc kubenswrapper[5127]: I0201 07:38:33.611528 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerDied","Data":"9058ba129c8ef5281eae4b67c86bf735cbb3cbfd75cdfbc6b6c4858092e8a735"} Feb 01 07:38:33 crc kubenswrapper[5127]: I0201 07:38:33.611593 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerStarted","Data":"558fbc1918417746ff16b6daf592481eeb3004f033dcc5228eb8120725120ec0"} Feb 01 07:38:33 crc kubenswrapper[5127]: I0201 07:38:33.614267 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:38:35 crc kubenswrapper[5127]: I0201 07:38:35.635569 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerID="5c6f7777144815eccd2643e5f42d04c365ae5d371c6c37b1a54cc114de3a94a4" exitCode=0 Feb 01 07:38:35 crc kubenswrapper[5127]: I0201 07:38:35.635720 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerDied","Data":"5c6f7777144815eccd2643e5f42d04c365ae5d371c6c37b1a54cc114de3a94a4"} Feb 01 07:38:36 crc kubenswrapper[5127]: I0201 07:38:36.741334 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:38:36 crc kubenswrapper[5127]: I0201 07:38:36.741942 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:38:37 crc kubenswrapper[5127]: I0201 07:38:37.656068 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerStarted","Data":"d63f6baff7c945d5a11000cd78629e7a53e970ecb7ed1722031a3bc305dac0d6"} Feb 01 07:38:37 crc kubenswrapper[5127]: I0201 07:38:37.677764 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-lbf7n" podStartSLOduration=2.767158996 podStartE2EDuration="5.677745727s" podCreationTimestamp="2026-02-01 07:38:32 +0000 UTC" firstStartedPulling="2026-02-01 07:38:33.613894896 +0000 UTC m=+3064.099797269" lastFinishedPulling="2026-02-01 07:38:36.524481597 +0000 UTC m=+3067.010384000" observedRunningTime="2026-02-01 07:38:37.67485421 +0000 UTC m=+3068.160756613" watchObservedRunningTime="2026-02-01 07:38:37.677745727 +0000 UTC m=+3068.163648090" Feb 01 07:38:42 crc kubenswrapper[5127]: I0201 07:38:42.490950 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:42 crc kubenswrapper[5127]: I0201 07:38:42.491017 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:42 crc kubenswrapper[5127]: I0201 07:38:42.553039 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:42 crc kubenswrapper[5127]: I0201 07:38:42.783846 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:43 crc kubenswrapper[5127]: I0201 07:38:43.722309 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:44 crc kubenswrapper[5127]: I0201 07:38:44.729430 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lbf7n" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="registry-server" containerID="cri-o://d63f6baff7c945d5a11000cd78629e7a53e970ecb7ed1722031a3bc305dac0d6" gracePeriod=2 Feb 01 07:38:45 crc kubenswrapper[5127]: I0201 07:38:45.742864 5127 generic.go:334] "Generic (PLEG): container finished" podID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerID="d63f6baff7c945d5a11000cd78629e7a53e970ecb7ed1722031a3bc305dac0d6" exitCode=0 Feb 01 07:38:45 crc kubenswrapper[5127]: I0201 07:38:45.742917 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerDied","Data":"d63f6baff7c945d5a11000cd78629e7a53e970ecb7ed1722031a3bc305dac0d6"} Feb 01 07:38:45 crc kubenswrapper[5127]: I0201 07:38:45.855487 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.056103 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content\") pod \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.056267 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5q5j\" (UniqueName: \"kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j\") pod \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.056344 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities\") pod \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\" (UID: \"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449\") " Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.058162 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities" (OuterVolumeSpecName: "utilities") pod "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" (UID: "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.071136 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j" (OuterVolumeSpecName: "kube-api-access-c5q5j") pod "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" (UID: "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449"). InnerVolumeSpecName "kube-api-access-c5q5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.158889 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5q5j\" (UniqueName: \"kubernetes.io/projected/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-kube-api-access-c5q5j\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.159004 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.757400 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbf7n" event={"ID":"1c8d9195-0eb7-46fd-8b7a-a23ca04e5449","Type":"ContainerDied","Data":"558fbc1918417746ff16b6daf592481eeb3004f033dcc5228eb8120725120ec0"} Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.757477 5127 scope.go:117] "RemoveContainer" containerID="d63f6baff7c945d5a11000cd78629e7a53e970ecb7ed1722031a3bc305dac0d6" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.757772 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lbf7n" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.791552 5127 scope.go:117] "RemoveContainer" containerID="5c6f7777144815eccd2643e5f42d04c365ae5d371c6c37b1a54cc114de3a94a4" Feb 01 07:38:46 crc kubenswrapper[5127]: I0201 07:38:46.817275 5127 scope.go:117] "RemoveContainer" containerID="9058ba129c8ef5281eae4b67c86bf735cbb3cbfd75cdfbc6b6c4858092e8a735" Feb 01 07:38:47 crc kubenswrapper[5127]: I0201 07:38:47.234242 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" (UID: "1c8d9195-0eb7-46fd-8b7a-a23ca04e5449"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:38:47 crc kubenswrapper[5127]: I0201 07:38:47.278305 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:38:47 crc kubenswrapper[5127]: I0201 07:38:47.400399 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:47 crc kubenswrapper[5127]: I0201 07:38:47.409655 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lbf7n"] Feb 01 07:38:48 crc kubenswrapper[5127]: I0201 07:38:48.243847 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" path="/var/lib/kubelet/pods/1c8d9195-0eb7-46fd-8b7a-a23ca04e5449/volumes" Feb 01 07:39:06 crc kubenswrapper[5127]: I0201 07:39:06.740682 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:39:06 crc kubenswrapper[5127]: I0201 07:39:06.741360 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:39:36 crc kubenswrapper[5127]: I0201 07:39:36.740647 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:39:36 crc kubenswrapper[5127]: I0201 07:39:36.741650 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:39:36 crc kubenswrapper[5127]: I0201 07:39:36.741724 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:39:36 crc kubenswrapper[5127]: I0201 07:39:36.742563 5127 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:39:36 crc kubenswrapper[5127]: I0201 07:39:36.742702 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" gracePeriod=600 Feb 01 07:39:36 crc kubenswrapper[5127]: E0201 07:39:36.872098 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:39:37 crc kubenswrapper[5127]: I0201 07:39:37.277945 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" exitCode=0 Feb 01 07:39:37 crc kubenswrapper[5127]: I0201 07:39:37.278057 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"} Feb 01 07:39:37 crc kubenswrapper[5127]: I0201 07:39:37.278741 5127 scope.go:117] "RemoveContainer" containerID="3a63736ef3cf3cc2307709b81159ba7325080642674f71606be463daa31de062" Feb 01 07:39:37 crc kubenswrapper[5127]: I0201 07:39:37.279549 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:39:37 crc kubenswrapper[5127]: E0201 07:39:37.280087 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:39:51 crc kubenswrapper[5127]: I0201 07:39:51.236089 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:39:51 crc kubenswrapper[5127]: E0201 07:39:51.237036 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:40:05 crc kubenswrapper[5127]: I0201 07:40:05.236216 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:40:05 crc 
Feb 01 07:40:05 crc kubenswrapper[5127]: I0201 07:40:05.236216 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:40:05 crc kubenswrapper[5127]: E0201 07:40:05.237530 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:40:17 crc kubenswrapper[5127]: I0201 07:40:17.235874 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:40:17 crc kubenswrapper[5127]: E0201 07:40:17.236516 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:40:29 crc kubenswrapper[5127]: I0201 07:40:29.236424 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:40:29 crc kubenswrapper[5127]: E0201 07:40:29.237371 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:40:42 crc kubenswrapper[5127]: I0201 07:40:42.236128 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:40:42 crc kubenswrapper[5127]: E0201 07:40:42.237452 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:40:57 crc kubenswrapper[5127]: I0201 07:40:57.236300 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:40:57 crc kubenswrapper[5127]: E0201 07:40:57.237849 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:41:11 crc kubenswrapper[5127]: I0201 07:41:11.236093 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119"
Feb 01 07:41:11 crc kubenswrapper[5127]: E0201 07:41:11.237211 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff:
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:41:22 crc kubenswrapper[5127]: I0201 07:41:22.236044 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:41:22 crc kubenswrapper[5127]: E0201 07:41:22.237134 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:41:37 crc kubenswrapper[5127]: I0201 07:41:37.235914 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:41:37 crc kubenswrapper[5127]: E0201 07:41:37.237399 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:41:50 crc kubenswrapper[5127]: I0201 07:41:50.245675 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:41:50 crc kubenswrapper[5127]: E0201 07:41:50.246957 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:42:02 crc kubenswrapper[5127]: I0201 07:42:02.235668 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:42:02 crc kubenswrapper[5127]: E0201 07:42:02.236703 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:42:15 crc kubenswrapper[5127]: I0201 07:42:15.235734 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:42:15 crc kubenswrapper[5127]: E0201 07:42:15.236664 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:42:26 crc kubenswrapper[5127]: I0201 07:42:26.236431 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:42:26 crc kubenswrapper[5127]: E0201 07:42:26.237799 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:42:41 crc kubenswrapper[5127]: I0201 07:42:41.235914 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:42:41 crc kubenswrapper[5127]: E0201 07:42:41.237226 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:42:54 crc kubenswrapper[5127]: I0201 07:42:54.235784 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:42:54 crc kubenswrapper[5127]: E0201 07:42:54.238960 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:43:09 crc kubenswrapper[5127]: I0201 07:43:09.236141 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:43:09 crc kubenswrapper[5127]: E0201 07:43:09.237247 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:43:23 crc kubenswrapper[5127]: I0201 07:43:23.236113 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:43:23 crc kubenswrapper[5127]: E0201 07:43:23.236956 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.235393 5127 
scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:43:36 crc kubenswrapper[5127]: E0201 07:43:36.236445 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.524143 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:36 crc kubenswrapper[5127]: E0201 07:43:36.524920 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="extract-content" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.524952 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="extract-content" Feb 01 07:43:36 crc kubenswrapper[5127]: E0201 07:43:36.524980 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="extract-utilities" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.524995 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="extract-utilities" Feb 01 07:43:36 crc kubenswrapper[5127]: E0201 07:43:36.525024 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="registry-server" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.525039 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="registry-server" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.528488 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8d9195-0eb7-46fd-8b7a-a23ca04e5449" containerName="registry-server" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.531326 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.547806 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.712723 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d27qb\" (UniqueName: \"kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.712817 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.712886 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.813711 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d27qb\" (UniqueName: \"kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.814075 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.814510 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.814551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.814942 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.847749 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d27qb\" (UniqueName: \"kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb\") pod \"redhat-operators-qrp25\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:36 crc kubenswrapper[5127]: I0201 07:43:36.900022 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:37 crc kubenswrapper[5127]: I0201 07:43:37.356833 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:38 crc kubenswrapper[5127]: I0201 07:43:38.232808 5127 generic.go:334] "Generic (PLEG): container finished" podID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerID="e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90" exitCode=0 Feb 01 07:43:38 crc kubenswrapper[5127]: I0201 07:43:38.232861 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerDied","Data":"e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90"} Feb 01 07:43:38 crc kubenswrapper[5127]: I0201 07:43:38.233133 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerStarted","Data":"70bbeeccb50facf5196eefd4e2c28caeaa15bbe117c955443dc731d21e235821"} Feb 01 07:43:38 crc kubenswrapper[5127]: I0201 07:43:38.234952 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:43:39 crc kubenswrapper[5127]: I0201 07:43:39.244324 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerStarted","Data":"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d"} Feb 01 07:43:40 crc kubenswrapper[5127]: I0201 07:43:40.255193 5127 generic.go:334] "Generic (PLEG): container finished" podID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerID="f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d" exitCode=0 Feb 01 07:43:40 crc kubenswrapper[5127]: I0201 07:43:40.255244 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerDied","Data":"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d"} Feb 01 07:43:41 crc kubenswrapper[5127]: I0201 07:43:41.269087 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerStarted","Data":"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2"} Feb 01 07:43:41 crc kubenswrapper[5127]: I0201 07:43:41.302697 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qrp25" podStartSLOduration=2.724144045 podStartE2EDuration="5.302664569s" podCreationTimestamp="2026-02-01 07:43:36 +0000 UTC" firstStartedPulling="2026-02-01 07:43:38.234749578 +0000 UTC m=+3368.720651941" lastFinishedPulling="2026-02-01 07:43:40.813270072 +0000 UTC m=+3371.299172465" observedRunningTime="2026-02-01 07:43:41.299356032 +0000 UTC m=+3371.785258455" watchObservedRunningTime="2026-02-01 07:43:41.302664569 +0000 UTC m=+3371.788566982" Feb 01 07:43:46 crc 
kubenswrapper[5127]: I0201 07:43:46.900610 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:46 crc kubenswrapper[5127]: I0201 07:43:46.901486 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:47 crc kubenswrapper[5127]: I0201 07:43:47.955205 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qrp25" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="registry-server" probeResult="failure" output=< Feb 01 07:43:47 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 07:43:47 crc kubenswrapper[5127]: > Feb 01 07:43:51 crc kubenswrapper[5127]: I0201 07:43:51.235947 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:43:51 crc kubenswrapper[5127]: E0201 07:43:51.238220 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:43:56 crc kubenswrapper[5127]: I0201 07:43:56.969760 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:57 crc kubenswrapper[5127]: I0201 07:43:57.051778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:57 crc kubenswrapper[5127]: I0201 07:43:57.230114 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:58 crc kubenswrapper[5127]: I0201 07:43:58.417349 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qrp25" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="registry-server" containerID="cri-o://08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2" gracePeriod=2 Feb 01 07:43:58 crc kubenswrapper[5127]: I0201 07:43:58.949797 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.069785 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content\") pod \"53802df2-4065-4226-a7ee-f08f594d0c9f\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.069839 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d27qb\" (UniqueName: \"kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb\") pod \"53802df2-4065-4226-a7ee-f08f594d0c9f\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.069868 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities\") pod \"53802df2-4065-4226-a7ee-f08f594d0c9f\" (UID: \"53802df2-4065-4226-a7ee-f08f594d0c9f\") " Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.070933 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities" (OuterVolumeSpecName: "utilities") pod "53802df2-4065-4226-a7ee-f08f594d0c9f" (UID: "53802df2-4065-4226-a7ee-f08f594d0c9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.074845 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb" (OuterVolumeSpecName: "kube-api-access-d27qb") pod "53802df2-4065-4226-a7ee-f08f594d0c9f" (UID: "53802df2-4065-4226-a7ee-f08f594d0c9f"). InnerVolumeSpecName "kube-api-access-d27qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.171118 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d27qb\" (UniqueName: \"kubernetes.io/projected/53802df2-4065-4226-a7ee-f08f594d0c9f-kube-api-access-d27qb\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.171162 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.208922 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53802df2-4065-4226-a7ee-f08f594d0c9f" (UID: "53802df2-4065-4226-a7ee-f08f594d0c9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.272749 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53802df2-4065-4226-a7ee-f08f594d0c9f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.434616 5127 generic.go:334] "Generic (PLEG): container finished" podID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerID="08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2" exitCode=0 Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.434716 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qrp25" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.434709 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerDied","Data":"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2"} Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.434920 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qrp25" event={"ID":"53802df2-4065-4226-a7ee-f08f594d0c9f","Type":"ContainerDied","Data":"70bbeeccb50facf5196eefd4e2c28caeaa15bbe117c955443dc731d21e235821"} Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.434965 5127 scope.go:117] "RemoveContainer" containerID="08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.497400 5127 scope.go:117] "RemoveContainer" containerID="f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.503841 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.515917 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qrp25"] Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.537461 5127 scope.go:117] "RemoveContainer" containerID="e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.581103 5127 scope.go:117] "RemoveContainer" containerID="08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2" Feb 01 07:43:59 crc kubenswrapper[5127]: E0201 07:43:59.582342 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2\": container with ID starting with 08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2 not found: ID does not exist" containerID="08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.582470 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2"} err="failed to get container status \"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2\": rpc error: code = NotFound desc = could not find container \"08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2\": container with ID starting with 08b7122a7a27b80a3e960e3708d86d548384cf98fe02f9d4ea0cb0266487dcb2 not found: ID does not exist" Feb 01 07:43:59 crc 
kubenswrapper[5127]: I0201 07:43:59.582546 5127 scope.go:117] "RemoveContainer" containerID="f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d" Feb 01 07:43:59 crc kubenswrapper[5127]: E0201 07:43:59.583606 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d\": container with ID starting with f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d not found: ID does not exist" containerID="f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.583637 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d"} err="failed to get container status \"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d\": rpc error: code = NotFound desc = could not find container \"f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d\": container with ID starting with f342cf39f0b52d446ff3b09cce27661ca052ab60ab9ee18a1c0c67d1bb62ed5d not found: ID does not exist" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.583659 5127 scope.go:117] "RemoveContainer" containerID="e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90" Feb 01 07:43:59 crc kubenswrapper[5127]: E0201 07:43:59.583965 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90\": container with ID starting with e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90 not found: ID does not exist" containerID="e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90" Feb 01 07:43:59 crc kubenswrapper[5127]: I0201 07:43:59.584011 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90"} err="failed to get container status \"e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90\": rpc error: code = NotFound desc = could not find container \"e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90\": container with ID starting with e2cf4a56fad81217165298f5f1ee5890674bf5071557ef875c3e59c7acc28a90 not found: ID does not exist" Feb 01 07:44:00 crc kubenswrapper[5127]: I0201 07:44:00.259492 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" path="/var/lib/kubelet/pods/53802df2-4065-4226-a7ee-f08f594d0c9f/volumes" Feb 01 07:44:03 crc kubenswrapper[5127]: I0201 07:44:03.236157 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:44:03 crc kubenswrapper[5127]: E0201 07:44:03.236847 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:44:17 crc kubenswrapper[5127]: I0201 07:44:17.236045 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" 
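
The same RemoveContainer/back-off pair has now recurred for several minutes above, interleaved with the marketplace catalog pods coming and going. When triaging a capture like this one, counting the failure signatures per pod is faster than reading serially; a small Go filter that tallies "Probe failed" and "Error syncing pod" entries from journal text on stdin (the regexes key off the pod="..." field visible in these lines; everything else about the line layout is incidental):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	probeRE := regexp.MustCompile(`"Probe failed".*pod="([^"]+)"`)
	syncRE := regexp.MustCompile(`"Error syncing pod, skipping".*pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if m := probeRE.FindStringSubmatch(line); m != nil {
			counts["probe-failed "+m[1]]++
		} else if m := syncRE.FindStringSubmatch(line); m != nil {
			counts["sync-backoff "+m[1]]++
		}
	}
	for k, v := range counts {
		fmt.Printf("%6d  %s\n", v, k)
	}
}
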
Feb 01 07:44:17 crc kubenswrapper[5127]: E0201 07:44:17.237167 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.637883 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:21 crc kubenswrapper[5127]: E0201 07:44:21.638496 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="extract-utilities" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.638509 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="extract-utilities" Feb 01 07:44:21 crc kubenswrapper[5127]: E0201 07:44:21.638535 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="extract-content" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.638542 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="extract-content" Feb 01 07:44:21 crc kubenswrapper[5127]: E0201 07:44:21.638550 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="registry-server" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.638556 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="registry-server" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.639925 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="53802df2-4065-4226-a7ee-f08f594d0c9f" containerName="registry-server" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.642865 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.686369 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.772448 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.772513 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pxs9\" (UniqueName: \"kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.772541 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.874660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.874754 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pxs9\" (UniqueName: \"kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.874790 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.875144 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.875187 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.896204 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6pxs9\" (UniqueName: \"kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9\") pod \"redhat-marketplace-zz4r4\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:21 crc kubenswrapper[5127]: I0201 07:44:21.978349 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:22 crc kubenswrapper[5127]: I0201 07:44:22.509466 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:22 crc kubenswrapper[5127]: I0201 07:44:22.667786 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerStarted","Data":"a21e8c571f5c8d0fd085f4848c01c11f038f737e1cc745deec36c009d1d10a39"} Feb 01 07:44:23 crc kubenswrapper[5127]: I0201 07:44:23.678153 5127 generic.go:334] "Generic (PLEG): container finished" podID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerID="41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717" exitCode=0 Feb 01 07:44:23 crc kubenswrapper[5127]: I0201 07:44:23.678226 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerDied","Data":"41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717"} Feb 01 07:44:24 crc kubenswrapper[5127]: I0201 07:44:24.687657 5127 generic.go:334] "Generic (PLEG): container finished" podID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerID="1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327" exitCode=0 Feb 01 07:44:24 crc kubenswrapper[5127]: I0201 07:44:24.687815 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerDied","Data":"1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327"} Feb 01 07:44:25 crc kubenswrapper[5127]: I0201 07:44:25.696562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerStarted","Data":"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec"} Feb 01 07:44:25 crc kubenswrapper[5127]: I0201 07:44:25.725574 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zz4r4" podStartSLOduration=2.970427457 podStartE2EDuration="4.725556848s" podCreationTimestamp="2026-02-01 07:44:21 +0000 UTC" firstStartedPulling="2026-02-01 07:44:23.680175441 +0000 UTC m=+3414.166077834" lastFinishedPulling="2026-02-01 07:44:25.435304822 +0000 UTC m=+3415.921207225" observedRunningTime="2026-02-01 07:44:25.720017791 +0000 UTC m=+3416.205920154" watchObservedRunningTime="2026-02-01 07:44:25.725556848 +0000 UTC m=+3416.211459211" Feb 01 07:44:31 crc kubenswrapper[5127]: I0201 07:44:31.979402 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:31 crc kubenswrapper[5127]: I0201 07:44:31.979960 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:32 crc kubenswrapper[5127]: I0201 07:44:32.036960 5127 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:32 crc kubenswrapper[5127]: I0201 07:44:32.235810 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:44:32 crc kubenswrapper[5127]: E0201 07:44:32.236070 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:44:32 crc kubenswrapper[5127]: I0201 07:44:32.815693 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:32 crc kubenswrapper[5127]: I0201 07:44:32.879822 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:34 crc kubenswrapper[5127]: I0201 07:44:34.774646 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zz4r4" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="registry-server" containerID="cri-o://e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec" gracePeriod=2 Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.266915 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.286614 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pxs9\" (UniqueName: \"kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9\") pod \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.286679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities\") pod \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.286699 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content\") pod \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\" (UID: \"26ff2cac-dc39-4ec0-9c5d-015ee60ea889\") " Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.288480 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities" (OuterVolumeSpecName: "utilities") pod "26ff2cac-dc39-4ec0-9c5d-015ee60ea889" (UID: "26ff2cac-dc39-4ec0-9c5d-015ee60ea889"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.295776 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9" (OuterVolumeSpecName: "kube-api-access-6pxs9") pod "26ff2cac-dc39-4ec0-9c5d-015ee60ea889" (UID: "26ff2cac-dc39-4ec0-9c5d-015ee60ea889"). 
InnerVolumeSpecName "kube-api-access-6pxs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.322225 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26ff2cac-dc39-4ec0-9c5d-015ee60ea889" (UID: "26ff2cac-dc39-4ec0-9c5d-015ee60ea889"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.388208 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pxs9\" (UniqueName: \"kubernetes.io/projected/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-kube-api-access-6pxs9\") on node \"crc\" DevicePath \"\"" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.388503 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.388632 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26ff2cac-dc39-4ec0-9c5d-015ee60ea889-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.783208 5127 generic.go:334] "Generic (PLEG): container finished" podID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerID="e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec" exitCode=0 Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.783287 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zz4r4" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.784145 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerDied","Data":"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec"} Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.784261 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zz4r4" event={"ID":"26ff2cac-dc39-4ec0-9c5d-015ee60ea889","Type":"ContainerDied","Data":"a21e8c571f5c8d0fd085f4848c01c11f038f737e1cc745deec36c009d1d10a39"} Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.784290 5127 scope.go:117] "RemoveContainer" containerID="e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.798620 5127 scope.go:117] "RemoveContainer" containerID="1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.812527 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.824417 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zz4r4"] Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.841962 5127 scope.go:117] "RemoveContainer" containerID="41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.856432 5127 scope.go:117] "RemoveContainer" containerID="e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec" Feb 01 07:44:35 crc kubenswrapper[5127]: 
E0201 07:44:35.856934 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec\": container with ID starting with e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec not found: ID does not exist" containerID="e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.856975 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec"} err="failed to get container status \"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec\": rpc error: code = NotFound desc = could not find container \"e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec\": container with ID starting with e37fb75a6edf399910da7bebd2f33b5da83f44af08daed450ffba4648ee8b7ec not found: ID does not exist" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.857006 5127 scope.go:117] "RemoveContainer" containerID="1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327" Feb 01 07:44:35 crc kubenswrapper[5127]: E0201 07:44:35.857388 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327\": container with ID starting with 1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327 not found: ID does not exist" containerID="1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.857511 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327"} err="failed to get container status \"1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327\": rpc error: code = NotFound desc = could not find container \"1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327\": container with ID starting with 1e2c925dfb7171c9e5d9cb9a54f5e14351d2bbae1798dda039b790d5f0412327 not found: ID does not exist" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.857634 5127 scope.go:117] "RemoveContainer" containerID="41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717" Feb 01 07:44:35 crc kubenswrapper[5127]: E0201 07:44:35.858041 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717\": container with ID starting with 41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717 not found: ID does not exist" containerID="41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717" Feb 01 07:44:35 crc kubenswrapper[5127]: I0201 07:44:35.858066 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717"} err="failed to get container status \"41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717\": rpc error: code = NotFound desc = could not find container \"41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717\": container with ID starting with 41aba85f4f198760b605c555edf1b915d8ec64bc2ea3d79bb1751f3ac7faf717 not found: ID does not exist" Feb 01 07:44:36 crc kubenswrapper[5127]: I0201 07:44:36.249832 
5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" path="/var/lib/kubelet/pods/26ff2cac-dc39-4ec0-9c5d-015ee60ea889/volumes" Feb 01 07:44:45 crc kubenswrapper[5127]: I0201 07:44:45.236128 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:44:45 crc kubenswrapper[5127]: I0201 07:44:45.877565 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2"} Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.186923 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn"] Feb 01 07:45:00 crc kubenswrapper[5127]: E0201 07:45:00.193754 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.193874 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[5127]: E0201 07:45:00.193991 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="extract-utilities" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.196801 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="extract-utilities" Feb 01 07:45:00 crc kubenswrapper[5127]: E0201 07:45:00.196973 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="extract-content" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.197115 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="extract-content" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.197624 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="26ff2cac-dc39-4ec0-9c5d-015ee60ea889" containerName="registry-server" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.199402 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.205411 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.206042 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.224820 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn"] Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.286238 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gh67\" (UniqueName: \"kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.286322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.286506 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.387308 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.387374 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gh67\" (UniqueName: \"kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.387401 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.388542 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume\") pod 
\"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.396455 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.406867 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gh67\" (UniqueName: \"kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67\") pod \"collect-profiles-29498865-4pksn\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:00 crc kubenswrapper[5127]: I0201 07:45:00.541638 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:01 crc kubenswrapper[5127]: I0201 07:45:01.031632 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn"] Feb 01 07:45:01 crc kubenswrapper[5127]: W0201 07:45:01.047445 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8335c315_d03b_495f_99c6_26d8bf68938a.slice/crio-8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d WatchSource:0}: Error finding container 8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d: Status 404 returned error can't find the container with id 8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d Feb 01 07:45:02 crc kubenswrapper[5127]: I0201 07:45:02.027467 5127 generic.go:334] "Generic (PLEG): container finished" podID="8335c315-d03b-495f-99c6-26d8bf68938a" containerID="e1c6cba150a023d7c39a6c73b619f811ea3c6900f8f91ebe0fe28f34e682b4dc" exitCode=0 Feb 01 07:45:02 crc kubenswrapper[5127]: I0201 07:45:02.027601 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" event={"ID":"8335c315-d03b-495f-99c6-26d8bf68938a","Type":"ContainerDied","Data":"e1c6cba150a023d7c39a6c73b619f811ea3c6900f8f91ebe0fe28f34e682b4dc"} Feb 01 07:45:02 crc kubenswrapper[5127]: I0201 07:45:02.027714 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" event={"ID":"8335c315-d03b-495f-99c6-26d8bf68938a","Type":"ContainerStarted","Data":"8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d"} Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.370054 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.531958 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume\") pod \"8335c315-d03b-495f-99c6-26d8bf68938a\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.532049 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gh67\" (UniqueName: \"kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67\") pod \"8335c315-d03b-495f-99c6-26d8bf68938a\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.532100 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume\") pod \"8335c315-d03b-495f-99c6-26d8bf68938a\" (UID: \"8335c315-d03b-495f-99c6-26d8bf68938a\") " Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.533308 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume" (OuterVolumeSpecName: "config-volume") pod "8335c315-d03b-495f-99c6-26d8bf68938a" (UID: "8335c315-d03b-495f-99c6-26d8bf68938a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.539440 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8335c315-d03b-495f-99c6-26d8bf68938a" (UID: "8335c315-d03b-495f-99c6-26d8bf68938a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.540995 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67" (OuterVolumeSpecName: "kube-api-access-7gh67") pod "8335c315-d03b-495f-99c6-26d8bf68938a" (UID: "8335c315-d03b-495f-99c6-26d8bf68938a"). InnerVolumeSpecName "kube-api-access-7gh67". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.634997 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8335c315-d03b-495f-99c6-26d8bf68938a-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.635090 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gh67\" (UniqueName: \"kubernetes.io/projected/8335c315-d03b-495f-99c6-26d8bf68938a-kube-api-access-7gh67\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:03 crc kubenswrapper[5127]: I0201 07:45:03.635114 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8335c315-d03b-495f-99c6-26d8bf68938a-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 07:45:04 crc kubenswrapper[5127]: I0201 07:45:04.048973 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" event={"ID":"8335c315-d03b-495f-99c6-26d8bf68938a","Type":"ContainerDied","Data":"8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d"} Feb 01 07:45:04 crc kubenswrapper[5127]: I0201 07:45:04.049027 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8930d0447b273f3dadc74555b60c79c436c22822c7087ccdbeeaa8961841e18d" Feb 01 07:45:04 crc kubenswrapper[5127]: I0201 07:45:04.049077 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn" Feb 01 07:45:04 crc kubenswrapper[5127]: I0201 07:45:04.475397 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662"] Feb 01 07:45:04 crc kubenswrapper[5127]: I0201 07:45:04.489525 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498820-2b662"] Feb 01 07:45:06 crc kubenswrapper[5127]: I0201 07:45:06.245229 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a20b3259-17e4-4994-85fb-efd8f4cb4aa5" path="/var/lib/kubelet/pods/a20b3259-17e4-4994-85fb-efd8f4cb4aa5/volumes" Feb 01 07:45:40 crc kubenswrapper[5127]: I0201 07:45:40.711202 5127 scope.go:117] "RemoveContainer" containerID="db80a2538217fba326d03439d56791ae70b8dc5fe90d51f0f4aec2b075f86734" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.146847 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:11 crc kubenswrapper[5127]: E0201 07:46:11.147803 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8335c315-d03b-495f-99c6-26d8bf68938a" containerName="collect-profiles" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.147821 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8335c315-d03b-495f-99c6-26d8bf68938a" containerName="collect-profiles" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.148036 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8335c315-d03b-495f-99c6-26d8bf68938a" containerName="collect-profiles" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.149196 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.170477 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.249828 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.249888 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv85q\" (UniqueName: \"kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.249973 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.351901 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.352071 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.352120 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv85q\" (UniqueName: \"kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.352524 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.353051 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.377337 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rv85q\" (UniqueName: \"kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q\") pod \"community-operators-dwtvl\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.494646 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:11 crc kubenswrapper[5127]: I0201 07:46:11.986677 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:11 crc kubenswrapper[5127]: W0201 07:46:11.997198 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb159f979_52d8_49cc_aa99_1fd8395da992.slice/crio-e08b53a504d73ceab5d52f604bd765648f620ece0f1c7363b16218c599e5fa30 WatchSource:0}: Error finding container e08b53a504d73ceab5d52f604bd765648f620ece0f1c7363b16218c599e5fa30: Status 404 returned error can't find the container with id e08b53a504d73ceab5d52f604bd765648f620ece0f1c7363b16218c599e5fa30 Feb 01 07:46:12 crc kubenswrapper[5127]: I0201 07:46:12.678248 5127 generic.go:334] "Generic (PLEG): container finished" podID="b159f979-52d8-49cc-aa99-1fd8395da992" containerID="6f49a3675128ebec0c86b462bc5da8d2b5d7ea75e8cb273ef3f672723f1d8ae8" exitCode=0 Feb 01 07:46:12 crc kubenswrapper[5127]: I0201 07:46:12.678349 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerDied","Data":"6f49a3675128ebec0c86b462bc5da8d2b5d7ea75e8cb273ef3f672723f1d8ae8"} Feb 01 07:46:12 crc kubenswrapper[5127]: I0201 07:46:12.678690 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerStarted","Data":"e08b53a504d73ceab5d52f604bd765648f620ece0f1c7363b16218c599e5fa30"} Feb 01 07:46:13 crc kubenswrapper[5127]: I0201 07:46:13.693974 5127 generic.go:334] "Generic (PLEG): container finished" podID="b159f979-52d8-49cc-aa99-1fd8395da992" containerID="4a803a58ed56caa252522b75e008bb1cb983012cc3baa25358e2a0acb76ebee8" exitCode=0 Feb 01 07:46:13 crc kubenswrapper[5127]: I0201 07:46:13.694167 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerDied","Data":"4a803a58ed56caa252522b75e008bb1cb983012cc3baa25358e2a0acb76ebee8"} Feb 01 07:46:14 crc kubenswrapper[5127]: I0201 07:46:14.704994 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerStarted","Data":"67438fb0d00691be59a0698a9710408bfbe26bd2f7e47cb39dd4e3540c272a1a"} Feb 01 07:46:14 crc kubenswrapper[5127]: I0201 07:46:14.730019 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dwtvl" podStartSLOduration=2.277937367 podStartE2EDuration="3.72999032s" podCreationTimestamp="2026-02-01 07:46:11 +0000 UTC" firstStartedPulling="2026-02-01 07:46:12.680262201 +0000 UTC m=+3523.166164595" lastFinishedPulling="2026-02-01 07:46:14.132315165 +0000 UTC m=+3524.618217548" observedRunningTime="2026-02-01 07:46:14.728354047 +0000 UTC 
m=+3525.214256450" watchObservedRunningTime="2026-02-01 07:46:14.72999032 +0000 UTC m=+3525.215892723" Feb 01 07:46:21 crc kubenswrapper[5127]: I0201 07:46:21.495407 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:21 crc kubenswrapper[5127]: I0201 07:46:21.496252 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:21 crc kubenswrapper[5127]: I0201 07:46:21.559059 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:21 crc kubenswrapper[5127]: I0201 07:46:21.820947 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:21 crc kubenswrapper[5127]: I0201 07:46:21.880735 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:23 crc kubenswrapper[5127]: I0201 07:46:23.799201 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dwtvl" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="registry-server" containerID="cri-o://67438fb0d00691be59a0698a9710408bfbe26bd2f7e47cb39dd4e3540c272a1a" gracePeriod=2 Feb 01 07:46:24 crc kubenswrapper[5127]: I0201 07:46:24.812829 5127 generic.go:334] "Generic (PLEG): container finished" podID="b159f979-52d8-49cc-aa99-1fd8395da992" containerID="67438fb0d00691be59a0698a9710408bfbe26bd2f7e47cb39dd4e3540c272a1a" exitCode=0 Feb 01 07:46:24 crc kubenswrapper[5127]: I0201 07:46:24.812902 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerDied","Data":"67438fb0d00691be59a0698a9710408bfbe26bd2f7e47cb39dd4e3540c272a1a"} Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.441186 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.591200 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv85q\" (UniqueName: \"kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q\") pod \"b159f979-52d8-49cc-aa99-1fd8395da992\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.591340 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities\") pod \"b159f979-52d8-49cc-aa99-1fd8395da992\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.591393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content\") pod \"b159f979-52d8-49cc-aa99-1fd8395da992\" (UID: \"b159f979-52d8-49cc-aa99-1fd8395da992\") " Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.599500 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities" (OuterVolumeSpecName: "utilities") pod "b159f979-52d8-49cc-aa99-1fd8395da992" (UID: "b159f979-52d8-49cc-aa99-1fd8395da992"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.606980 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q" (OuterVolumeSpecName: "kube-api-access-rv85q") pod "b159f979-52d8-49cc-aa99-1fd8395da992" (UID: "b159f979-52d8-49cc-aa99-1fd8395da992"). InnerVolumeSpecName "kube-api-access-rv85q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.645571 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b159f979-52d8-49cc-aa99-1fd8395da992" (UID: "b159f979-52d8-49cc-aa99-1fd8395da992"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.693282 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv85q\" (UniqueName: \"kubernetes.io/projected/b159f979-52d8-49cc-aa99-1fd8395da992-kube-api-access-rv85q\") on node \"crc\" DevicePath \"\"" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.693316 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.693326 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b159f979-52d8-49cc-aa99-1fd8395da992-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.825892 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dwtvl" event={"ID":"b159f979-52d8-49cc-aa99-1fd8395da992","Type":"ContainerDied","Data":"e08b53a504d73ceab5d52f604bd765648f620ece0f1c7363b16218c599e5fa30"} Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.825964 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dwtvl" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.826019 5127 scope.go:117] "RemoveContainer" containerID="67438fb0d00691be59a0698a9710408bfbe26bd2f7e47cb39dd4e3540c272a1a" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.863789 5127 scope.go:117] "RemoveContainer" containerID="4a803a58ed56caa252522b75e008bb1cb983012cc3baa25358e2a0acb76ebee8" Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.874365 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.882684 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dwtvl"] Feb 01 07:46:25 crc kubenswrapper[5127]: I0201 07:46:25.899168 5127 scope.go:117] "RemoveContainer" containerID="6f49a3675128ebec0c86b462bc5da8d2b5d7ea75e8cb273ef3f672723f1d8ae8" Feb 01 07:46:26 crc kubenswrapper[5127]: I0201 07:46:26.251067 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" path="/var/lib/kubelet/pods/b159f979-52d8-49cc-aa99-1fd8395da992/volumes" Feb 01 07:47:06 crc kubenswrapper[5127]: I0201 07:47:06.741338 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:47:06 crc kubenswrapper[5127]: I0201 07:47:06.742143 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:47:36 crc kubenswrapper[5127]: I0201 07:47:36.741288 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:47:36 crc kubenswrapper[5127]: I0201 07:47:36.741991 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:48:06 crc kubenswrapper[5127]: I0201 07:48:06.740336 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:48:06 crc kubenswrapper[5127]: I0201 07:48:06.740939 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:48:06 crc kubenswrapper[5127]: I0201 07:48:06.741004 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:48:06 crc kubenswrapper[5127]: I0201 07:48:06.742132 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:48:06 crc kubenswrapper[5127]: I0201 07:48:06.742324 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2" gracePeriod=600 Feb 01 07:48:07 crc kubenswrapper[5127]: I0201 07:48:07.756156 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2" exitCode=0 Feb 01 07:48:07 crc kubenswrapper[5127]: I0201 07:48:07.756310 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2"} Feb 01 07:48:07 crc kubenswrapper[5127]: I0201 07:48:07.756853 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"} Feb 01 07:48:07 crc kubenswrapper[5127]: I0201 07:48:07.756897 5127 scope.go:117] "RemoveContainer" containerID="89993dda7ad9cddbe4754c9fd060f8931e5b9e7e9431e44295a53c08a9938119" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.042043 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:05 crc 
kubenswrapper[5127]: E0201 07:49:05.042758 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="extract-content" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.042770 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="extract-content" Feb 01 07:49:05 crc kubenswrapper[5127]: E0201 07:49:05.042793 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="registry-server" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.042799 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="registry-server" Feb 01 07:49:05 crc kubenswrapper[5127]: E0201 07:49:05.042808 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="extract-utilities" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.042814 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="extract-utilities" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.042949 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b159f979-52d8-49cc-aa99-1fd8395da992" containerName="registry-server" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.043917 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.076860 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.122622 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.122849 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgr6l\" (UniqueName: \"kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.123038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.224909 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.225445 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgr6l\" 
(UniqueName: \"kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.225388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.225857 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.226390 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.246666 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgr6l\" (UniqueName: \"kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l\") pod \"certified-operators-n856l\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.367709 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:05 crc kubenswrapper[5127]: I0201 07:49:05.683767 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:06 crc kubenswrapper[5127]: I0201 07:49:06.285306 5127 generic.go:334] "Generic (PLEG): container finished" podID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerID="6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b" exitCode=0 Feb 01 07:49:06 crc kubenswrapper[5127]: I0201 07:49:06.285416 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerDied","Data":"6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b"} Feb 01 07:49:06 crc kubenswrapper[5127]: I0201 07:49:06.285644 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerStarted","Data":"ffe5bd28814d5bd66517e5fe19247f34070bf17f8b0d503f87b5af6cb7e94c3e"} Feb 01 07:49:06 crc kubenswrapper[5127]: I0201 07:49:06.288509 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:49:07 crc kubenswrapper[5127]: I0201 07:49:07.293485 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerStarted","Data":"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7"} Feb 01 07:49:08 crc kubenswrapper[5127]: I0201 07:49:08.306138 5127 generic.go:334] "Generic (PLEG): container finished" podID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerID="ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7" exitCode=0 Feb 01 07:49:08 crc kubenswrapper[5127]: I0201 07:49:08.306258 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerDied","Data":"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7"} Feb 01 07:49:08 crc kubenswrapper[5127]: I0201 07:49:08.306661 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerStarted","Data":"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9"} Feb 01 07:49:08 crc kubenswrapper[5127]: I0201 07:49:08.371516 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n856l" podStartSLOduration=1.966131411 podStartE2EDuration="3.371494543s" podCreationTimestamp="2026-02-01 07:49:05 +0000 UTC" firstStartedPulling="2026-02-01 07:49:06.288084931 +0000 UTC m=+3696.773987334" lastFinishedPulling="2026-02-01 07:49:07.693448103 +0000 UTC m=+3698.179350466" observedRunningTime="2026-02-01 07:49:08.363984012 +0000 UTC m=+3698.849886375" watchObservedRunningTime="2026-02-01 07:49:08.371494543 +0000 UTC m=+3698.857396926" Feb 01 07:49:15 crc kubenswrapper[5127]: I0201 07:49:15.368930 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:15 crc kubenswrapper[5127]: I0201 07:49:15.370081 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:15 crc kubenswrapper[5127]: I0201 07:49:15.445353 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:16 crc kubenswrapper[5127]: I0201 07:49:16.434620 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:16 crc kubenswrapper[5127]: I0201 07:49:16.497038 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.395864 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n856l" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="registry-server" containerID="cri-o://ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9" gracePeriod=2 Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.877886 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.946560 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgr6l\" (UniqueName: \"kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l\") pod \"8409fde7-21d1-4a54-9d08-9c12f6f24610\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.946666 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content\") pod \"8409fde7-21d1-4a54-9d08-9c12f6f24610\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.946789 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities\") pod \"8409fde7-21d1-4a54-9d08-9c12f6f24610\" (UID: \"8409fde7-21d1-4a54-9d08-9c12f6f24610\") " Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.948051 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities" (OuterVolumeSpecName: "utilities") pod "8409fde7-21d1-4a54-9d08-9c12f6f24610" (UID: "8409fde7-21d1-4a54-9d08-9c12f6f24610"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:49:18 crc kubenswrapper[5127]: I0201 07:49:18.954997 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l" (OuterVolumeSpecName: "kube-api-access-xgr6l") pod "8409fde7-21d1-4a54-9d08-9c12f6f24610" (UID: "8409fde7-21d1-4a54-9d08-9c12f6f24610"). InnerVolumeSpecName "kube-api-access-xgr6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.023602 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8409fde7-21d1-4a54-9d08-9c12f6f24610" (UID: "8409fde7-21d1-4a54-9d08-9c12f6f24610"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.048246 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgr6l\" (UniqueName: \"kubernetes.io/projected/8409fde7-21d1-4a54-9d08-9c12f6f24610-kube-api-access-xgr6l\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.048314 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.048346 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8409fde7-21d1-4a54-9d08-9c12f6f24610-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.410090 5127 generic.go:334] "Generic (PLEG): container finished" podID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerID="ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9" exitCode=0 Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.410154 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerDied","Data":"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9"} Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.410230 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n856l" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.410267 5127 scope.go:117] "RemoveContainer" containerID="ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.410245 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n856l" event={"ID":"8409fde7-21d1-4a54-9d08-9c12f6f24610","Type":"ContainerDied","Data":"ffe5bd28814d5bd66517e5fe19247f34070bf17f8b0d503f87b5af6cb7e94c3e"} Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.453958 5127 scope.go:117] "RemoveContainer" containerID="ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.468304 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.479793 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n856l"] Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.494473 5127 scope.go:117] "RemoveContainer" containerID="6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.536937 5127 scope.go:117] "RemoveContainer" containerID="ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9" Feb 01 07:49:19 crc kubenswrapper[5127]: E0201 07:49:19.537661 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9\": container with ID starting with ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9 not found: ID does not exist" containerID="ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.537740 
5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9"} err="failed to get container status \"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9\": rpc error: code = NotFound desc = could not find container \"ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9\": container with ID starting with ee0d2287860c4f05ed5ea9faadc6025770a6c90848ac0574553c0c1cdf98cfa9 not found: ID does not exist" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.537787 5127 scope.go:117] "RemoveContainer" containerID="ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7" Feb 01 07:49:19 crc kubenswrapper[5127]: E0201 07:49:19.538947 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7\": container with ID starting with ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7 not found: ID does not exist" containerID="ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.539012 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7"} err="failed to get container status \"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7\": rpc error: code = NotFound desc = could not find container \"ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7\": container with ID starting with ec600fee9bbbf3c5f71a1e397e27e1801e8983e0902e142bf80236955f8ed0b7 not found: ID does not exist" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.539066 5127 scope.go:117] "RemoveContainer" containerID="6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b" Feb 01 07:49:19 crc kubenswrapper[5127]: E0201 07:49:19.539768 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b\": container with ID starting with 6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b not found: ID does not exist" containerID="6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b" Feb 01 07:49:19 crc kubenswrapper[5127]: I0201 07:49:19.539866 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b"} err="failed to get container status \"6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b\": rpc error: code = NotFound desc = could not find container \"6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b\": container with ID starting with 6b24f1395a8fd3bc6631d083a7894899440bf6c3a01114d35827eb77fbf6da7b not found: ID does not exist" Feb 01 07:49:20 crc kubenswrapper[5127]: I0201 07:49:20.251435 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" path="/var/lib/kubelet/pods/8409fde7-21d1-4a54-9d08-9c12f6f24610/volumes" Feb 01 07:50:36 crc kubenswrapper[5127]: I0201 07:50:36.740576 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:50:36 crc kubenswrapper[5127]: I0201 07:50:36.741363 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:51:06 crc kubenswrapper[5127]: I0201 07:51:06.741219 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:51:06 crc kubenswrapper[5127]: I0201 07:51:06.741922 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:51:36 crc kubenswrapper[5127]: I0201 07:51:36.741356 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:51:36 crc kubenswrapper[5127]: I0201 07:51:36.742264 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:51:36 crc kubenswrapper[5127]: I0201 07:51:36.742333 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 07:51:36 crc kubenswrapper[5127]: I0201 07:51:36.743269 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 07:51:36 crc kubenswrapper[5127]: I0201 07:51:36.743369 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" gracePeriod=600 Feb 01 07:51:36 crc kubenswrapper[5127]: E0201 07:51:36.870638 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:51:37 crc kubenswrapper[5127]: I0201 07:51:37.650155 5127 
generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" exitCode=0 Feb 01 07:51:37 crc kubenswrapper[5127]: I0201 07:51:37.650237 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"} Feb 01 07:51:37 crc kubenswrapper[5127]: I0201 07:51:37.650612 5127 scope.go:117] "RemoveContainer" containerID="9f2b7c60b4e95a6c9c01f0d4b0bddc2d95ecb74ead5fd68ec5315991b2f7d4c2" Feb 01 07:51:37 crc kubenswrapper[5127]: I0201 07:51:37.651999 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:51:37 crc kubenswrapper[5127]: E0201 07:51:37.652451 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:51:52 crc kubenswrapper[5127]: I0201 07:51:52.235671 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:51:52 crc kubenswrapper[5127]: E0201 07:51:52.236638 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:52:04 crc kubenswrapper[5127]: I0201 07:52:04.235894 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:52:04 crc kubenswrapper[5127]: E0201 07:52:04.236802 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:52:18 crc kubenswrapper[5127]: I0201 07:52:18.236430 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:52:18 crc kubenswrapper[5127]: E0201 07:52:18.237511 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:52:33 crc kubenswrapper[5127]: I0201 07:52:33.236340 5127 scope.go:117] "RemoveContainer" 
containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:52:33 crc kubenswrapper[5127]: E0201 07:52:33.237534 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:52:45 crc kubenswrapper[5127]: I0201 07:52:45.236276 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:52:45 crc kubenswrapper[5127]: E0201 07:52:45.237500 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:52:56 crc kubenswrapper[5127]: I0201 07:52:56.235744 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:52:56 crc kubenswrapper[5127]: E0201 07:52:56.236756 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:10 crc kubenswrapper[5127]: I0201 07:53:10.243207 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:53:10 crc kubenswrapper[5127]: E0201 07:53:10.244195 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:21 crc kubenswrapper[5127]: I0201 07:53:21.235642 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:53:21 crc kubenswrapper[5127]: E0201 07:53:21.236445 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:32 crc kubenswrapper[5127]: I0201 07:53:32.235498 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:53:32 crc kubenswrapper[5127]: E0201 07:53:32.236535 5127 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.924409 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:53:36 crc kubenswrapper[5127]: E0201 07:53:36.924945 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="extract-utilities" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.924957 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="extract-utilities" Feb 01 07:53:36 crc kubenswrapper[5127]: E0201 07:53:36.924973 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="registry-server" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.924980 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="registry-server" Feb 01 07:53:36 crc kubenswrapper[5127]: E0201 07:53:36.925001 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="extract-content" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.925007 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="extract-content" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.925132 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8409fde7-21d1-4a54-9d08-9c12f6f24610" containerName="registry-server" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.926066 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:36 crc kubenswrapper[5127]: I0201 07:53:36.956878 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.078026 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.078115 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.078153 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpn9h\" (UniqueName: \"kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.180957 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.181349 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.181500 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpn9h\" (UniqueName: \"kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.181665 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.181843 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.201569 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dpn9h\" (UniqueName: \"kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h\") pod \"redhat-operators-qnz7s\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.243125 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.678653 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:53:37 crc kubenswrapper[5127]: I0201 07:53:37.718250 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerStarted","Data":"c7d86cc2c18b43fb545c155ad283ef5b110adecb18e6df099c91b2618e7606ea"} Feb 01 07:53:38 crc kubenswrapper[5127]: I0201 07:53:38.725677 5127 generic.go:334] "Generic (PLEG): container finished" podID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerID="76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a" exitCode=0 Feb 01 07:53:38 crc kubenswrapper[5127]: I0201 07:53:38.725731 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerDied","Data":"76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a"} Feb 01 07:53:39 crc kubenswrapper[5127]: I0201 07:53:39.733337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerStarted","Data":"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4"} Feb 01 07:53:40 crc kubenswrapper[5127]: I0201 07:53:40.745849 5127 generic.go:334] "Generic (PLEG): container finished" podID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerID="7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4" exitCode=0 Feb 01 07:53:40 crc kubenswrapper[5127]: I0201 07:53:40.745923 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerDied","Data":"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4"} Feb 01 07:53:41 crc kubenswrapper[5127]: I0201 07:53:41.756628 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerStarted","Data":"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60"} Feb 01 07:53:41 crc kubenswrapper[5127]: I0201 07:53:41.773891 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qnz7s" podStartSLOduration=3.31595558 podStartE2EDuration="5.773873454s" podCreationTimestamp="2026-02-01 07:53:36 +0000 UTC" firstStartedPulling="2026-02-01 07:53:38.727107089 +0000 UTC m=+3969.213009462" lastFinishedPulling="2026-02-01 07:53:41.185024953 +0000 UTC m=+3971.670927336" observedRunningTime="2026-02-01 07:53:41.773421642 +0000 UTC m=+3972.259323995" watchObservedRunningTime="2026-02-01 07:53:41.773873454 +0000 UTC m=+3972.259775827" Feb 01 07:53:44 crc kubenswrapper[5127]: I0201 07:53:44.237527 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 
07:53:44 crc kubenswrapper[5127]: E0201 07:53:44.238084 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:47 crc kubenswrapper[5127]: I0201 07:53:47.243610 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:47 crc kubenswrapper[5127]: I0201 07:53:47.244050 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:48 crc kubenswrapper[5127]: I0201 07:53:48.323518 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qnz7s" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="registry-server" probeResult="failure" output=< Feb 01 07:53:48 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 07:53:48 crc kubenswrapper[5127]: > Feb 01 07:53:56 crc kubenswrapper[5127]: I0201 07:53:56.236175 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:53:56 crc kubenswrapper[5127]: E0201 07:53:56.238523 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:53:57 crc kubenswrapper[5127]: I0201 07:53:57.306475 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:53:57 crc kubenswrapper[5127]: I0201 07:53:57.376140 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:54:00 crc kubenswrapper[5127]: I0201 07:54:00.810187 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:54:00 crc kubenswrapper[5127]: I0201 07:54:00.811061 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qnz7s" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="registry-server" containerID="cri-o://890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60" gracePeriod=2 Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.248003 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.442390 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpn9h\" (UniqueName: \"kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h\") pod \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.442452 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content\") pod \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.442487 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities\") pod \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\" (UID: \"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed\") " Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.443476 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities" (OuterVolumeSpecName: "utilities") pod "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" (UID: "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.447719 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h" (OuterVolumeSpecName: "kube-api-access-dpn9h") pod "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" (UID: "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed"). InnerVolumeSpecName "kube-api-access-dpn9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.543941 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpn9h\" (UniqueName: \"kubernetes.io/projected/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-kube-api-access-dpn9h\") on node \"crc\" DevicePath \"\"" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.543977 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.547391 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" (UID: "e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.644841 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.945686 5127 generic.go:334] "Generic (PLEG): container finished" podID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerID="890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60" exitCode=0 Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.945788 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qnz7s" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.945787 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerDied","Data":"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60"} Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.946164 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qnz7s" event={"ID":"e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed","Type":"ContainerDied","Data":"c7d86cc2c18b43fb545c155ad283ef5b110adecb18e6df099c91b2618e7606ea"} Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.946231 5127 scope.go:117] "RemoveContainer" containerID="890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.972446 5127 scope.go:117] "RemoveContainer" containerID="7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4" Feb 01 07:54:01 crc kubenswrapper[5127]: I0201 07:54:01.999469 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.004348 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qnz7s"] Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.026023 5127 scope.go:117] "RemoveContainer" containerID="76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.041653 5127 scope.go:117] "RemoveContainer" containerID="890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60" Feb 01 07:54:02 crc kubenswrapper[5127]: E0201 07:54:02.042420 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60\": container with ID starting with 890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60 not found: ID does not exist" containerID="890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.042459 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60"} err="failed to get container status \"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60\": rpc error: code = NotFound desc = could not find container \"890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60\": container with ID starting with 890423ae733be481b6633182e0e49cf16a9b9dba601887d49169418768c18c60 not found: ID does not exist" Feb 01 07:54:02 crc 
kubenswrapper[5127]: I0201 07:54:02.042483 5127 scope.go:117] "RemoveContainer" containerID="7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4" Feb 01 07:54:02 crc kubenswrapper[5127]: E0201 07:54:02.042823 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4\": container with ID starting with 7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4 not found: ID does not exist" containerID="7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.042882 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4"} err="failed to get container status \"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4\": rpc error: code = NotFound desc = could not find container \"7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4\": container with ID starting with 7c712e37be5bf205c223cc25849f0cffede5543a6fffe85398c42cd47b6d4be4 not found: ID does not exist" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.042918 5127 scope.go:117] "RemoveContainer" containerID="76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a" Feb 01 07:54:02 crc kubenswrapper[5127]: E0201 07:54:02.043303 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a\": container with ID starting with 76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a not found: ID does not exist" containerID="76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.043336 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a"} err="failed to get container status \"76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a\": rpc error: code = NotFound desc = could not find container \"76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a\": container with ID starting with 76634a3d4d0e3c7ee9afe45ef2f305ad8fcce86c7d59816bdd5cea63233be05a not found: ID does not exist" Feb 01 07:54:02 crc kubenswrapper[5127]: I0201 07:54:02.247385 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" path="/var/lib/kubelet/pods/e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed/volumes" Feb 01 07:54:11 crc kubenswrapper[5127]: I0201 07:54:11.235904 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:54:11 crc kubenswrapper[5127]: E0201 07:54:11.236610 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:54:24 crc kubenswrapper[5127]: I0201 07:54:24.236440 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" 
Feb 01 07:54:24 crc kubenswrapper[5127]: E0201 07:54:24.237259 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:54:36 crc kubenswrapper[5127]: I0201 07:54:36.236410 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:54:36 crc kubenswrapper[5127]: E0201 07:54:36.237727 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:54:49 crc kubenswrapper[5127]: I0201 07:54:49.236145 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:54:49 crc kubenswrapper[5127]: E0201 07:54:49.237766 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:55:02 crc kubenswrapper[5127]: I0201 07:55:02.236227 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:55:02 crc kubenswrapper[5127]: E0201 07:55:02.237367 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:55:13 crc kubenswrapper[5127]: I0201 07:55:13.235513 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:55:13 crc kubenswrapper[5127]: E0201 07:55:13.236308 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.222844 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"] Feb 01 07:55:16 crc kubenswrapper[5127]: E0201 07:55:16.223979 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="extract-content" 
Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.224003 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="extract-content" Feb 01 07:55:16 crc kubenswrapper[5127]: E0201 07:55:16.224037 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="registry-server" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.224049 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="registry-server" Feb 01 07:55:16 crc kubenswrapper[5127]: E0201 07:55:16.224106 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="extract-utilities" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.224121 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="extract-utilities" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.224507 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e22e25-c952-4bb3-8c7c-6b54e7aab3ed" containerName="registry-server" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.229847 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.264611 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"] Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.388271 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.388320 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6sl\" (UniqueName: \"kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.388437 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.490200 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.490257 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " 
pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.490283 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6sl\" (UniqueName: \"kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.490875 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.490963 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.515188 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6sl\" (UniqueName: \"kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl\") pod \"redhat-marketplace-6dlvb\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") " pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.571355 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:16 crc kubenswrapper[5127]: I0201 07:55:16.902333 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"] Feb 01 07:55:17 crc kubenswrapper[5127]: I0201 07:55:17.614626 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff782ac0-12f3-4067-94ca-28965602c223" containerID="8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8" exitCode=0 Feb 01 07:55:17 crc kubenswrapper[5127]: I0201 07:55:17.614689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerDied","Data":"8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8"} Feb 01 07:55:17 crc kubenswrapper[5127]: I0201 07:55:17.614729 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerStarted","Data":"01336741c7ef9b2be39a24157bb1d7f9d3e108361f8902fc3636deaebccfc7ed"} Feb 01 07:55:17 crc kubenswrapper[5127]: I0201 07:55:17.618343 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 07:55:18 crc kubenswrapper[5127]: I0201 07:55:18.628457 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff782ac0-12f3-4067-94ca-28965602c223" containerID="f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf" exitCode=0 Feb 01 07:55:18 crc kubenswrapper[5127]: I0201 07:55:18.628520 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" 
event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerDied","Data":"f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf"} Feb 01 07:55:20 crc kubenswrapper[5127]: I0201 07:55:20.651084 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerStarted","Data":"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"} Feb 01 07:55:20 crc kubenswrapper[5127]: I0201 07:55:20.681902 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dlvb" podStartSLOduration=2.298999898 podStartE2EDuration="4.681874372s" podCreationTimestamp="2026-02-01 07:55:16 +0000 UTC" firstStartedPulling="2026-02-01 07:55:17.617888654 +0000 UTC m=+4068.103791057" lastFinishedPulling="2026-02-01 07:55:20.000763138 +0000 UTC m=+4070.486665531" observedRunningTime="2026-02-01 07:55:20.677391841 +0000 UTC m=+4071.163294274" watchObservedRunningTime="2026-02-01 07:55:20.681874372 +0000 UTC m=+4071.167776775" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.235230 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 07:55:26 crc kubenswrapper[5127]: E0201 07:55:26.235890 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.571989 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.572063 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.638260 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.765283 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dlvb" Feb 01 07:55:26 crc kubenswrapper[5127]: I0201 07:55:26.878430 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"] Feb 01 07:55:29 crc kubenswrapper[5127]: I0201 07:55:29.052842 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dlvb" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="registry-server" containerID="cri-o://c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea" gracePeriod=2 Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.032792 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.070493 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff782ac0-12f3-4067-94ca-28965602c223" containerID="c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea" exitCode=0
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.070608 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dlvb"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.070583 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerDied","Data":"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"}
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.070768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dlvb" event={"ID":"ff782ac0-12f3-4067-94ca-28965602c223","Type":"ContainerDied","Data":"01336741c7ef9b2be39a24157bb1d7f9d3e108361f8902fc3636deaebccfc7ed"}
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.070811 5127 scope.go:117] "RemoveContainer" containerID="c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.093932 5127 scope.go:117] "RemoveContainer" containerID="f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.116280 5127 scope.go:117] "RemoveContainer" containerID="8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.152761 5127 scope.go:117] "RemoveContainer" containerID="c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"
Feb 01 07:55:30 crc kubenswrapper[5127]: E0201 07:55:30.153358 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea\": container with ID starting with c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea not found: ID does not exist" containerID="c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.153400 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea"} err="failed to get container status \"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea\": rpc error: code = NotFound desc = could not find container \"c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea\": container with ID starting with c9fd9baac4afbb84fb3583e057faa184f5bf5e0919f822b05301df5da0326aea not found: ID does not exist"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.153427 5127 scope.go:117] "RemoveContainer" containerID="f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf"
Feb 01 07:55:30 crc kubenswrapper[5127]: E0201 07:55:30.153893 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf\": container with ID starting with f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf not found: ID does not exist" containerID="f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.153947 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf"} err="failed to get container status \"f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf\": rpc error: code = NotFound desc = could not find container \"f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf\": container with ID starting with f9b5089d5d96edd59e113ddebb6486e937446a230f29d49a79fe7c41063a82bf not found: ID does not exist"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.153981 5127 scope.go:117] "RemoveContainer" containerID="8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8"
Feb 01 07:55:30 crc kubenswrapper[5127]: E0201 07:55:30.155150 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8\": container with ID starting with 8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8 not found: ID does not exist" containerID="8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.155265 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8"} err="failed to get container status \"8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8\": rpc error: code = NotFound desc = could not find container \"8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8\": container with ID starting with 8c40869ae3b34ddcdc9dab487c5f0cc28f221cb9dd80bded1893e664df3699b8 not found: ID does not exist"
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.175130 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content\") pod \"ff782ac0-12f3-4067-94ca-28965602c223\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") "
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.175250 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6sl\" (UniqueName: \"kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl\") pod \"ff782ac0-12f3-4067-94ca-28965602c223\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") "
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.175306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities\") pod \"ff782ac0-12f3-4067-94ca-28965602c223\" (UID: \"ff782ac0-12f3-4067-94ca-28965602c223\") "
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.177175 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities" (OuterVolumeSpecName: "utilities") pod "ff782ac0-12f3-4067-94ca-28965602c223" (UID: "ff782ac0-12f3-4067-94ca-28965602c223"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.182187 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl" (OuterVolumeSpecName: "kube-api-access-sq6sl") pod "ff782ac0-12f3-4067-94ca-28965602c223" (UID: "ff782ac0-12f3-4067-94ca-28965602c223"). InnerVolumeSpecName "kube-api-access-sq6sl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.205463 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff782ac0-12f3-4067-94ca-28965602c223" (UID: "ff782ac0-12f3-4067-94ca-28965602c223"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.278013 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6sl\" (UniqueName: \"kubernetes.io/projected/ff782ac0-12f3-4067-94ca-28965602c223-kube-api-access-sq6sl\") on node \"crc\" DevicePath \"\""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.278409 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.278623 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff782ac0-12f3-4067-94ca-28965602c223-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.392685 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"]
Feb 01 07:55:30 crc kubenswrapper[5127]: I0201 07:55:30.399486 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dlvb"]
Feb 01 07:55:32 crc kubenswrapper[5127]: I0201 07:55:32.251806 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff782ac0-12f3-4067-94ca-28965602c223" path="/var/lib/kubelet/pods/ff782ac0-12f3-4067-94ca-28965602c223/volumes"
Feb 01 07:55:38 crc kubenswrapper[5127]: I0201 07:55:38.235806 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:55:38 crc kubenswrapper[5127]: E0201 07:55:38.236549 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:55:52 crc kubenswrapper[5127]: I0201 07:55:52.235475 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:55:52 crc kubenswrapper[5127]: E0201 07:55:52.239318 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:56:04 crc kubenswrapper[5127]: I0201 07:56:04.236541 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:56:04 crc kubenswrapper[5127]: E0201 07:56:04.237569 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:56:18 crc kubenswrapper[5127]: I0201 07:56:18.235890 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:56:18 crc kubenswrapper[5127]: E0201 07:56:18.236952 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:56:32 crc kubenswrapper[5127]: I0201 07:56:32.235953 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:56:32 crc kubenswrapper[5127]: E0201 07:56:32.236672 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 07:56:43 crc kubenswrapper[5127]: I0201 07:56:43.235981 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0"
Feb 01 07:56:43 crc kubenswrapper[5127]: I0201 07:56:43.817894 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4"}
Feb 01 07:59:06 crc kubenswrapper[5127]: I0201 07:59:06.740924 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 07:59:06 crc kubenswrapper[5127]: I0201 07:59:06.741859 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
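
The recurring pod_workers.go entries show machine-config-daemon's restart backoff sitting at its ceiling: each failed restart doubles the wait until a cap, and only the successful 07:56:43 restart clears the counter. A sketch of that schedule, assuming the kubelet's usual defaults of a 10s initial delay doubled per crash and capped at 5m (the "back-off 5m0s" above; treat the exact constants as assumptions):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet defaults: 10s initial delay, doubled per failed
        // restart, capped at 5m -- the "back-off 5m0s" in the entries above.
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for i := 1; delay < maxDelay; i++ {
            fmt.Printf("crash %d: next restart in %v\n", i, delay)
            delay *= 2
        }
        fmt.Println("every later crash: next restart in", maxDelay)
    }
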
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.030283 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"]
Feb 01 07:59:23 crc kubenswrapper[5127]: E0201 07:59:23.031562 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="registry-server"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.031638 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="registry-server"
Feb 01 07:59:23 crc kubenswrapper[5127]: E0201 07:59:23.031712 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="extract-utilities"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.031730 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="extract-utilities"
Feb 01 07:59:23 crc kubenswrapper[5127]: E0201 07:59:23.031757 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="extract-content"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.031780 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="extract-content"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.032119 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff782ac0-12f3-4067-94ca-28965602c223" containerName="registry-server"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.034432 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.075880 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"]
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.199270 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.199402 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7d5z\" (UniqueName: \"kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.199959 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.301865 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.302115 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7d5z\" (UniqueName: \"kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.302197 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.302535 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.303010 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.330826 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7d5z\" (UniqueName: \"kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z\") pod \"certified-operators-jdf9m\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.401896 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:23 crc kubenswrapper[5127]: I0201 07:59:23.846529 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"]
Feb 01 07:59:24 crc kubenswrapper[5127]: I0201 07:59:24.252262 5127 generic.go:334] "Generic (PLEG): container finished" podID="89707d12-c899-4a37-8470-4be9fe303fcc" containerID="be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe" exitCode=0
Feb 01 07:59:24 crc kubenswrapper[5127]: I0201 07:59:24.253058 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerDied","Data":"be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe"}
Feb 01 07:59:24 crc kubenswrapper[5127]: I0201 07:59:24.253091 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerStarted","Data":"f735517ced8e203ad4ea4d4d0a8a932b813c273f5913befdbaff66b39a9d38ca"}
Feb 01 07:59:26 crc kubenswrapper[5127]: I0201 07:59:26.273543 5127 generic.go:334] "Generic (PLEG): container finished" podID="89707d12-c899-4a37-8470-4be9fe303fcc" containerID="948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5" exitCode=0
Feb 01 07:59:26 crc kubenswrapper[5127]: I0201 07:59:26.273674 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerDied","Data":"948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5"}
Feb 01 07:59:27 crc kubenswrapper[5127]: I0201 07:59:27.284273 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerStarted","Data":"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf"}
Feb 01 07:59:27 crc kubenswrapper[5127]: I0201 07:59:27.329364 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jdf9m" podStartSLOduration=2.888187914 podStartE2EDuration="5.329343427s" podCreationTimestamp="2026-02-01 07:59:22 +0000 UTC" firstStartedPulling="2026-02-01 07:59:24.25466754 +0000 UTC m=+4314.740569903" lastFinishedPulling="2026-02-01 07:59:26.695823013 +0000 UTC m=+4317.181725416" observedRunningTime="2026-02-01 07:59:27.322091071 +0000 UTC m=+4317.807993434" watchObservedRunningTime="2026-02-01 07:59:27.329343427 +0000 UTC m=+4317.815245800"
Feb 01 07:59:33 crc kubenswrapper[5127]: I0201 07:59:33.403136 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:33 crc kubenswrapper[5127]: I0201 07:59:33.404894 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:33 crc kubenswrapper[5127]: I0201 07:59:33.467286 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:34 crc kubenswrapper[5127]: I0201 07:59:34.427388 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jdf9m"
Feb 01 07:59:34 crc kubenswrapper[5127]: I0201 07:59:34.524956 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"]
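
Every kubenswrapper entry above follows the klog header format: a severity letter fused with MMDD, wall time, the PID, source file:line, a closing bracket, then a structured message with key=value pairs. A regex sketch for splitting those fields when post-processing a capture like this one (the field names are mine, not klog's):

    package main

    import (
        "fmt"
        "regexp"
    )

    // Matches headers like: I0201 07:59:33.404894 5127 kubelet.go:2542] "SyncLoop (probe)" ...
    var klogRe = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w.]+):(\d+)\] (.*)`)

    func main() {
        line := `I0201 07:59:33.404894 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jdf9m"`
        if m := klogRe.FindStringSubmatch(line); m != nil {
            fmt.Printf("severity=%s date=%s time=%s pid=%s file=%s:%s msg=%s\n",
                m[1], m[2], m[3], m[4], m[5], m[6], m[7])
        }
    }
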
source="api" pods=["openshift-marketplace/certified-operators-jdf9m"] Feb 01 07:59:36 crc kubenswrapper[5127]: I0201 07:59:36.374868 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jdf9m" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="registry-server" containerID="cri-o://ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf" gracePeriod=2 Feb 01 07:59:36 crc kubenswrapper[5127]: I0201 07:59:36.748747 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 07:59:36 crc kubenswrapper[5127]: I0201 07:59:36.749168 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 07:59:36 crc kubenswrapper[5127]: I0201 07:59:36.942449 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jdf9m" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.019749 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7d5z\" (UniqueName: \"kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z\") pod \"89707d12-c899-4a37-8470-4be9fe303fcc\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.019895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content\") pod \"89707d12-c899-4a37-8470-4be9fe303fcc\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.020062 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities\") pod \"89707d12-c899-4a37-8470-4be9fe303fcc\" (UID: \"89707d12-c899-4a37-8470-4be9fe303fcc\") " Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.021218 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities" (OuterVolumeSpecName: "utilities") pod "89707d12-c899-4a37-8470-4be9fe303fcc" (UID: "89707d12-c899-4a37-8470-4be9fe303fcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.031373 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z" (OuterVolumeSpecName: "kube-api-access-w7d5z") pod "89707d12-c899-4a37-8470-4be9fe303fcc" (UID: "89707d12-c899-4a37-8470-4be9fe303fcc"). InnerVolumeSpecName "kube-api-access-w7d5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.109059 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89707d12-c899-4a37-8470-4be9fe303fcc" (UID: "89707d12-c899-4a37-8470-4be9fe303fcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.121908 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.121952 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89707d12-c899-4a37-8470-4be9fe303fcc-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.121969 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7d5z\" (UniqueName: \"kubernetes.io/projected/89707d12-c899-4a37-8470-4be9fe303fcc-kube-api-access-w7d5z\") on node \"crc\" DevicePath \"\"" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.386139 5127 generic.go:334] "Generic (PLEG): container finished" podID="89707d12-c899-4a37-8470-4be9fe303fcc" containerID="ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf" exitCode=0 Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.386209 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerDied","Data":"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf"} Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.386251 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jdf9m" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.386287 5127 scope.go:117] "RemoveContainer" containerID="ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.386267 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jdf9m" event={"ID":"89707d12-c899-4a37-8470-4be9fe303fcc","Type":"ContainerDied","Data":"f735517ced8e203ad4ea4d4d0a8a932b813c273f5913befdbaff66b39a9d38ca"} Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.421200 5127 scope.go:117] "RemoveContainer" containerID="948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.442635 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"] Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.457012 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jdf9m"] Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.473727 5127 scope.go:117] "RemoveContainer" containerID="be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.512004 5127 scope.go:117] "RemoveContainer" containerID="ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf" Feb 01 07:59:37 crc kubenswrapper[5127]: E0201 07:59:37.512715 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf\": container with ID starting with ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf not found: ID does not exist" containerID="ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.512765 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf"} err="failed to get container status \"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf\": rpc error: code = NotFound desc = could not find container \"ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf\": container with ID starting with ae7729faca7511536a1971fdc5bb7fe7da8a94a9f9502ecd85a88d3239f6e4bf not found: ID does not exist" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.512790 5127 scope.go:117] "RemoveContainer" containerID="948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5" Feb 01 07:59:37 crc kubenswrapper[5127]: E0201 07:59:37.513172 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5\": container with ID starting with 948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5 not found: ID does not exist" containerID="948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.513205 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5"} err="failed to get container status \"948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5\": rpc error: code = NotFound desc = could not find 
container \"948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5\": container with ID starting with 948d38543fdd77c5fb681457cc9391f0ac92ac1d2e02806691bdaf63e36bf7e5 not found: ID does not exist" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.513225 5127 scope.go:117] "RemoveContainer" containerID="be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe" Feb 01 07:59:37 crc kubenswrapper[5127]: E0201 07:59:37.513545 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe\": container with ID starting with be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe not found: ID does not exist" containerID="be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe" Feb 01 07:59:37 crc kubenswrapper[5127]: I0201 07:59:37.513566 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe"} err="failed to get container status \"be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe\": rpc error: code = NotFound desc = could not find container \"be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe\": container with ID starting with be58c71003e2a8a64d6a7157527abdd5e5ada8b75403cda05fe2266bc64daafe not found: ID does not exist" Feb 01 07:59:38 crc kubenswrapper[5127]: I0201 07:59:38.250307 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" path="/var/lib/kubelet/pods/89707d12-c899-4a37-8470-4be9fe303fcc/volumes" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.221390 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"] Feb 01 08:00:00 crc kubenswrapper[5127]: E0201 08:00:00.222372 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="extract-content" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.222394 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="extract-content" Feb 01 08:00:00 crc kubenswrapper[5127]: E0201 08:00:00.222438 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.222453 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[5127]: E0201 08:00:00.222484 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="extract-utilities" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.222497 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="extract-utilities" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.222789 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="89707d12-c899-4a37-8470-4be9fe303fcc" containerName="registry-server" Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.223497 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.226664 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.227240 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.234134 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"]
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.401444 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.401618 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zcpn\" (UniqueName: \"kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.401663 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.503051 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.503142 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.503224 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zcpn\" (UniqueName: \"kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.504193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.511671 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.537452 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zcpn\" (UniqueName: \"kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn\") pod \"collect-profiles-29498880-66frb\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:00 crc kubenswrapper[5127]: I0201 08:00:00.562160 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:01 crc kubenswrapper[5127]: I0201 08:00:01.106703 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"]
Feb 01 08:00:01 crc kubenswrapper[5127]: I0201 08:00:01.625224 5127 generic.go:334] "Generic (PLEG): container finished" podID="c3e04d54-d98a-4b53-85b8-70986f9336c0" containerID="6eebf3f94eb7e4f54e2885f2f1dafa042ce828d929b9877695d271cd1dab33ad" exitCode=0
Feb 01 08:00:01 crc kubenswrapper[5127]: I0201 08:00:01.625286 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb" event={"ID":"c3e04d54-d98a-4b53-85b8-70986f9336c0","Type":"ContainerDied","Data":"6eebf3f94eb7e4f54e2885f2f1dafa042ce828d929b9877695d271cd1dab33ad"}
Feb 01 08:00:01 crc kubenswrapper[5127]: I0201 08:00:01.625672 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb" event={"ID":"c3e04d54-d98a-4b53-85b8-70986f9336c0","Type":"ContainerStarted","Data":"0164841487f52c28f11bdc8fdc5fcc6af771d2b83ff1e38d9612253dea64a942"}
Feb 01 08:00:02 crc kubenswrapper[5127]: I0201 08:00:02.955346 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.039315 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume\") pod \"c3e04d54-d98a-4b53-85b8-70986f9336c0\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") "
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.039630 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume\") pod \"c3e04d54-d98a-4b53-85b8-70986f9336c0\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") "
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.039741 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zcpn\" (UniqueName: \"kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn\") pod \"c3e04d54-d98a-4b53-85b8-70986f9336c0\" (UID: \"c3e04d54-d98a-4b53-85b8-70986f9336c0\") "
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.040445 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3e04d54-d98a-4b53-85b8-70986f9336c0" (UID: "c3e04d54-d98a-4b53-85b8-70986f9336c0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.046576 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn" (OuterVolumeSpecName: "kube-api-access-9zcpn") pod "c3e04d54-d98a-4b53-85b8-70986f9336c0" (UID: "c3e04d54-d98a-4b53-85b8-70986f9336c0"). InnerVolumeSpecName "kube-api-access-9zcpn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.046936 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3e04d54-d98a-4b53-85b8-70986f9336c0" (UID: "c3e04d54-d98a-4b53-85b8-70986f9336c0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.141702 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3e04d54-d98a-4b53-85b8-70986f9336c0-config-volume\") on node \"crc\" DevicePath \"\""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.141738 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3e04d54-d98a-4b53-85b8-70986f9336c0-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.141752 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zcpn\" (UniqueName: \"kubernetes.io/projected/c3e04d54-d98a-4b53-85b8-70986f9336c0-kube-api-access-9zcpn\") on node \"crc\" DevicePath \"\""
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.644961 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb" event={"ID":"c3e04d54-d98a-4b53-85b8-70986f9336c0","Type":"ContainerDied","Data":"0164841487f52c28f11bdc8fdc5fcc6af771d2b83ff1e38d9612253dea64a942"}
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.645011 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0164841487f52c28f11bdc8fdc5fcc6af771d2b83ff1e38d9612253dea64a942"
Feb 01 08:00:03 crc kubenswrapper[5127]: I0201 08:00:03.645048 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"
Feb 01 08:00:04 crc kubenswrapper[5127]: I0201 08:00:04.038408 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9"]
Feb 01 08:00:04 crc kubenswrapper[5127]: I0201 08:00:04.045674 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498835-m57d9"]
Feb 01 08:00:04 crc kubenswrapper[5127]: I0201 08:00:04.243898 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b" path="/var/lib/kubelet/pods/a35f48df-5c5c-4dd9-83c3-4bf6c62bb44b/volumes"
Feb 01 08:00:06 crc kubenswrapper[5127]: I0201 08:00:06.741173 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:00:06 crc kubenswrapper[5127]: I0201 08:00:06.741955 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:00:06 crc kubenswrapper[5127]: I0201 08:00:06.742054 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk"
Feb 01 08:00:06 crc kubenswrapper[5127]: I0201 08:00:06.743289 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:00:06 crc kubenswrapper[5127]: I0201 08:00:06.743675 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4" gracePeriod=600 Feb 01 08:00:07 crc kubenswrapper[5127]: I0201 08:00:07.686987 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4" exitCode=0 Feb 01 08:00:07 crc kubenswrapper[5127]: I0201 08:00:07.687073 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4"} Feb 01 08:00:07 crc kubenswrapper[5127]: I0201 08:00:07.688156 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4"} Feb 01 08:00:07 crc kubenswrapper[5127]: I0201 08:00:07.688202 5127 scope.go:117] "RemoveContainer" containerID="8852d2b1b43dd61404d386d18e8988b02d4b8af1f7501a32530c964f3121edc0" Feb 01 08:00:41 crc kubenswrapper[5127]: I0201 08:00:41.099845 5127 scope.go:117] "RemoveContainer" containerID="22dc9a98a17b06e1c8d657e2785592225469cee52b8a58d3e627bce4bf34b3fb" Feb 01 08:02:36 crc kubenswrapper[5127]: I0201 08:02:36.741405 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:02:36 crc kubenswrapper[5127]: I0201 08:02:36.742547 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:03:06 crc kubenswrapper[5127]: I0201 08:03:06.741460 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:03:06 crc kubenswrapper[5127]: I0201 08:03:06.742218 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:03:36 crc kubenswrapper[5127]: I0201 08:03:36.741234 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:03:36 crc kubenswrapper[5127]: I0201 08:03:36.742205 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:03:36 crc kubenswrapper[5127]: I0201 08:03:36.742291 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:03:36 crc kubenswrapper[5127]: I0201 08:03:36.743354 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:03:36 crc kubenswrapper[5127]: I0201 08:03:36.743495 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" gracePeriod=600 Feb 01 08:03:36 crc kubenswrapper[5127]: E0201 08:03:36.874344 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:03:37 crc kubenswrapper[5127]: I0201 08:03:37.623425 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" exitCode=0 Feb 01 08:03:37 crc kubenswrapper[5127]: I0201 08:03:37.623510 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4"} Feb 01 08:03:37 crc kubenswrapper[5127]: I0201 08:03:37.623818 5127 scope.go:117] "RemoveContainer" containerID="0d8ab4e73f6b38988fd588e736cc441fc650c1bdf535335926dfb8d1dc40f0c4" Feb 01 08:03:37 crc kubenswrapper[5127]: I0201 08:03:37.624293 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:03:37 crc kubenswrapper[5127]: E0201 08:03:37.624517 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:03:50 crc kubenswrapper[5127]: I0201 
08:03:50.244168 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:03:50 crc kubenswrapper[5127]: E0201 08:03:50.245236 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:04:04 crc kubenswrapper[5127]: I0201 08:04:04.236280 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:04:04 crc kubenswrapper[5127]: E0201 08:04:04.237115 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:04:16 crc kubenswrapper[5127]: I0201 08:04:16.235824 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:04:16 crc kubenswrapper[5127]: E0201 08:04:16.238028 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.507981 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:24 crc kubenswrapper[5127]: E0201 08:04:24.508949 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3e04d54-d98a-4b53-85b8-70986f9336c0" containerName="collect-profiles" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.508966 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3e04d54-d98a-4b53-85b8-70986f9336c0" containerName="collect-profiles" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.509148 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3e04d54-d98a-4b53-85b8-70986f9336c0" containerName="collect-profiles" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.510328 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.533950 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.694102 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.694525 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.694643 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5snt2\" (UniqueName: \"kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.796343 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5snt2\" (UniqueName: \"kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.796500 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.796550 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.797267 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.797355 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.823811 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5snt2\" (UniqueName: \"kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2\") pod \"redhat-operators-cknvj\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:24 crc kubenswrapper[5127]: I0201 08:04:24.853738 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:25 crc kubenswrapper[5127]: I0201 08:04:25.275719 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:25 crc kubenswrapper[5127]: W0201 08:04:25.284889 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65510a8c_3e0d_40ba_8fe0_83c74eb36b0e.slice/crio-16381080ba54bd5855fff779ed9b95a81a88ee084086025ddf30e761c010198e WatchSource:0}: Error finding container 16381080ba54bd5855fff779ed9b95a81a88ee084086025ddf30e761c010198e: Status 404 returned error can't find the container with id 16381080ba54bd5855fff779ed9b95a81a88ee084086025ddf30e761c010198e Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.078269 5127 generic.go:334] "Generic (PLEG): container finished" podID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerID="989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6" exitCode=0 Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.078357 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerDied","Data":"989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6"} Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.078661 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerStarted","Data":"16381080ba54bd5855fff779ed9b95a81a88ee084086025ddf30e761c010198e"} Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.080622 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.899328 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.902500 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:26 crc kubenswrapper[5127]: I0201 08:04:26.921196 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.032150 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.032242 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.032284 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzb57\" (UniqueName: \"kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.133203 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.133249 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzb57\" (UniqueName: \"kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.133334 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.133747 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.133803 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.155378 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gzb57\" (UniqueName: \"kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57\") pod \"community-operators-777tc\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.240554 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:27 crc kubenswrapper[5127]: I0201 08:04:27.732107 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:28 crc kubenswrapper[5127]: I0201 08:04:28.095178 5127 generic.go:334] "Generic (PLEG): container finished" podID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerID="fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20" exitCode=0 Feb 01 08:04:28 crc kubenswrapper[5127]: I0201 08:04:28.095281 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerDied","Data":"fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20"} Feb 01 08:04:28 crc kubenswrapper[5127]: I0201 08:04:28.095932 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerStarted","Data":"b4e8bf8e14bb281f8f0632a3207bb96bc4a01d54ed8957d27bacc362e2259c17"} Feb 01 08:04:28 crc kubenswrapper[5127]: I0201 08:04:28.101843 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerStarted","Data":"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b"} Feb 01 08:04:29 crc kubenswrapper[5127]: I0201 08:04:29.115069 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerStarted","Data":"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a"} Feb 01 08:04:29 crc kubenswrapper[5127]: I0201 08:04:29.117019 5127 generic.go:334] "Generic (PLEG): container finished" podID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerID="4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b" exitCode=0 Feb 01 08:04:29 crc kubenswrapper[5127]: I0201 08:04:29.117061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerDied","Data":"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b"} Feb 01 08:04:30 crc kubenswrapper[5127]: I0201 08:04:30.128801 5127 generic.go:334] "Generic (PLEG): container finished" podID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerID="8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a" exitCode=0 Feb 01 08:04:30 crc kubenswrapper[5127]: I0201 08:04:30.128913 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerDied","Data":"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a"} Feb 01 08:04:30 crc kubenswrapper[5127]: I0201 08:04:30.132649 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" 
event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerStarted","Data":"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666"} Feb 01 08:04:30 crc kubenswrapper[5127]: I0201 08:04:30.177117 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cknvj" podStartSLOduration=2.741805604 podStartE2EDuration="6.177088855s" podCreationTimestamp="2026-02-01 08:04:24 +0000 UTC" firstStartedPulling="2026-02-01 08:04:26.080383582 +0000 UTC m=+4616.566285945" lastFinishedPulling="2026-02-01 08:04:29.515666803 +0000 UTC m=+4620.001569196" observedRunningTime="2026-02-01 08:04:30.175283506 +0000 UTC m=+4620.661185879" watchObservedRunningTime="2026-02-01 08:04:30.177088855 +0000 UTC m=+4620.662991258" Feb 01 08:04:30 crc kubenswrapper[5127]: I0201 08:04:30.241936 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:04:30 crc kubenswrapper[5127]: E0201 08:04:30.242375 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:04:31 crc kubenswrapper[5127]: I0201 08:04:31.142610 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerStarted","Data":"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b"} Feb 01 08:04:31 crc kubenswrapper[5127]: I0201 08:04:31.172699 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-777tc" podStartSLOduration=2.682186061 podStartE2EDuration="5.172673736s" podCreationTimestamp="2026-02-01 08:04:26 +0000 UTC" firstStartedPulling="2026-02-01 08:04:28.097294948 +0000 UTC m=+4618.583197351" lastFinishedPulling="2026-02-01 08:04:30.587782663 +0000 UTC m=+4621.073685026" observedRunningTime="2026-02-01 08:04:31.168387512 +0000 UTC m=+4621.654289905" watchObservedRunningTime="2026-02-01 08:04:31.172673736 +0000 UTC m=+4621.658576129" Feb 01 08:04:34 crc kubenswrapper[5127]: I0201 08:04:34.854974 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:34 crc kubenswrapper[5127]: I0201 08:04:34.855329 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:35 crc kubenswrapper[5127]: I0201 08:04:35.914870 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cknvj" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="registry-server" probeResult="failure" output=< Feb 01 08:04:35 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:04:35 crc kubenswrapper[5127]: > Feb 01 08:04:37 crc kubenswrapper[5127]: I0201 08:04:37.241304 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:37 crc kubenswrapper[5127]: I0201 08:04:37.241363 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:37 crc kubenswrapper[5127]: I0201 08:04:37.319704 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:38 crc kubenswrapper[5127]: I0201 08:04:38.244095 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:38 crc kubenswrapper[5127]: I0201 08:04:38.305979 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.219223 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-777tc" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="registry-server" containerID="cri-o://1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b" gracePeriod=2 Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.684932 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.863513 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities\") pod \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.863840 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzb57\" (UniqueName: \"kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57\") pod \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.863896 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content\") pod \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\" (UID: \"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4\") " Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.866999 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities" (OuterVolumeSpecName: "utilities") pod "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" (UID: "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.885091 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57" (OuterVolumeSpecName: "kube-api-access-gzb57") pod "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" (UID: "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4"). InnerVolumeSpecName "kube-api-access-gzb57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.931236 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" (UID: "3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.966738 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzb57\" (UniqueName: \"kubernetes.io/projected/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-kube-api-access-gzb57\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.966783 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:40 crc kubenswrapper[5127]: I0201 08:04:40.966799 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.230802 5127 generic.go:334] "Generic (PLEG): container finished" podID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerID="1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b" exitCode=0 Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.230854 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerDied","Data":"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b"} Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.230909 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-777tc" event={"ID":"3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4","Type":"ContainerDied","Data":"b4e8bf8e14bb281f8f0632a3207bb96bc4a01d54ed8957d27bacc362e2259c17"} Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.230928 5127 scope.go:117] "RemoveContainer" containerID="1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.230924 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-777tc" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.235010 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:04:41 crc kubenswrapper[5127]: E0201 08:04:41.235297 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.258850 5127 scope.go:117] "RemoveContainer" containerID="8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.287148 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.293770 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-777tc"] Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.303819 5127 scope.go:117] "RemoveContainer" containerID="fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.323741 5127 scope.go:117] "RemoveContainer" containerID="1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b" Feb 01 08:04:41 crc kubenswrapper[5127]: E0201 08:04:41.324237 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b\": container with ID starting with 1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b not found: ID does not exist" containerID="1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.324299 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b"} err="failed to get container status \"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b\": rpc error: code = NotFound desc = could not find container \"1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b\": container with ID starting with 1546001a66115c53e08e3a7c2b22ef5739626c5b7fadad3ab6a6fd70d32ec74b not found: ID does not exist" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.324335 5127 scope.go:117] "RemoveContainer" containerID="8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a" Feb 01 08:04:41 crc kubenswrapper[5127]: E0201 08:04:41.324634 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a\": container with ID starting with 8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a not found: ID does not exist" containerID="8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.324653 5127 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a"} err="failed to get container status \"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a\": rpc error: code = NotFound desc = could not find container \"8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a\": container with ID starting with 8f27c903a307e735938ba7ac952255c6f644e51b3731883578e4244fad01567a not found: ID does not exist" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.324685 5127 scope.go:117] "RemoveContainer" containerID="fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20" Feb 01 08:04:41 crc kubenswrapper[5127]: E0201 08:04:41.325009 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20\": container with ID starting with fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20 not found: ID does not exist" containerID="fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20" Feb 01 08:04:41 crc kubenswrapper[5127]: I0201 08:04:41.325091 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20"} err="failed to get container status \"fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20\": rpc error: code = NotFound desc = could not find container \"fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20\": container with ID starting with fed217e2af3a5c072977ff772f092ee2bef298d721e673872185145c2f5ebb20 not found: ID does not exist" Feb 01 08:04:42 crc kubenswrapper[5127]: I0201 08:04:42.245785 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" path="/var/lib/kubelet/pods/3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4/volumes" Feb 01 08:04:44 crc kubenswrapper[5127]: I0201 08:04:44.929229 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:45 crc kubenswrapper[5127]: I0201 08:04:45.017498 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:45 crc kubenswrapper[5127]: I0201 08:04:45.173279 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.278345 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cknvj" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="registry-server" containerID="cri-o://b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666" gracePeriod=2 Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.781552 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.873542 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content\") pod \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.979374 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities\") pod \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.979491 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5snt2\" (UniqueName: \"kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2\") pod \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\" (UID: \"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e\") " Feb 01 08:04:46 crc kubenswrapper[5127]: I0201 08:04:46.980764 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities" (OuterVolumeSpecName: "utilities") pod "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" (UID: "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.044761 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" (UID: "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.080669 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.080698 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.280349 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2" (OuterVolumeSpecName: "kube-api-access-5snt2") pod "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" (UID: "65510a8c-3e0d-40ba-8fe0-83c74eb36b0e"). InnerVolumeSpecName "kube-api-access-5snt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.284999 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5snt2\" (UniqueName: \"kubernetes.io/projected/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e-kube-api-access-5snt2\") on node \"crc\" DevicePath \"\"" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.294382 5127 generic.go:334] "Generic (PLEG): container finished" podID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerID="b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666" exitCode=0 Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.294443 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerDied","Data":"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666"} Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.294487 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cknvj" event={"ID":"65510a8c-3e0d-40ba-8fe0-83c74eb36b0e","Type":"ContainerDied","Data":"16381080ba54bd5855fff779ed9b95a81a88ee084086025ddf30e761c010198e"} Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.294519 5127 scope.go:117] "RemoveContainer" containerID="b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.294516 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cknvj" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.358847 5127 scope.go:117] "RemoveContainer" containerID="4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.368048 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.382471 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cknvj"] Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.388198 5127 scope.go:117] "RemoveContainer" containerID="989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.526423 5127 scope.go:117] "RemoveContainer" containerID="b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666" Feb 01 08:04:47 crc kubenswrapper[5127]: E0201 08:04:47.527012 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666\": container with ID starting with b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666 not found: ID does not exist" containerID="b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.527059 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666"} err="failed to get container status \"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666\": rpc error: code = NotFound desc = could not find container \"b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666\": container with ID starting with b534057d0c7b58221d3f1464ede7744c09db2964496d481b04f5c7273dba2666 not found: ID does not exist" Feb 01 
08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.527086 5127 scope.go:117] "RemoveContainer" containerID="4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b" Feb 01 08:04:47 crc kubenswrapper[5127]: E0201 08:04:47.527506 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b\": container with ID starting with 4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b not found: ID does not exist" containerID="4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.527531 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b"} err="failed to get container status \"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b\": rpc error: code = NotFound desc = could not find container \"4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b\": container with ID starting with 4d21776577ab885a75dbd3ba8a092113149c318d4f4c15aaf7d58513d445215b not found: ID does not exist" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.527547 5127 scope.go:117] "RemoveContainer" containerID="989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6" Feb 01 08:04:47 crc kubenswrapper[5127]: E0201 08:04:47.527989 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6\": container with ID starting with 989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6 not found: ID does not exist" containerID="989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6" Feb 01 08:04:47 crc kubenswrapper[5127]: I0201 08:04:47.528015 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6"} err="failed to get container status \"989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6\": rpc error: code = NotFound desc = could not find container \"989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6\": container with ID starting with 989c83b953192cbdfdb2c4ca55f6bea6de9172b97059d8133bbbafa9044a82a6 not found: ID does not exist" Feb 01 08:04:48 crc kubenswrapper[5127]: I0201 08:04:48.249523 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" path="/var/lib/kubelet/pods/65510a8c-3e0d-40ba-8fe0-83c74eb36b0e/volumes" Feb 01 08:04:55 crc kubenswrapper[5127]: I0201 08:04:55.235674 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:04:55 crc kubenswrapper[5127]: E0201 08:04:55.237008 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:05:06 crc kubenswrapper[5127]: I0201 08:05:06.235867 5127 scope.go:117] "RemoveContainer" 
containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:05:06 crc kubenswrapper[5127]: E0201 08:05:06.236945 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:05:20 crc kubenswrapper[5127]: I0201 08:05:20.242853 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:05:20 crc kubenswrapper[5127]: E0201 08:05:20.243897 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:05:32 crc kubenswrapper[5127]: I0201 08:05:32.235863 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:05:32 crc kubenswrapper[5127]: E0201 08:05:32.237023 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:05:44 crc kubenswrapper[5127]: I0201 08:05:44.237292 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:05:44 crc kubenswrapper[5127]: E0201 08:05:44.238184 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:05:59 crc kubenswrapper[5127]: I0201 08:05:59.236333 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:05:59 crc kubenswrapper[5127]: E0201 08:05:59.237516 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:06:10 crc kubenswrapper[5127]: I0201 08:06:10.244097 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:06:10 crc kubenswrapper[5127]: E0201 08:06:10.245287 5127 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:06:24 crc kubenswrapper[5127]: I0201 08:06:24.235664 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:06:24 crc kubenswrapper[5127]: E0201 08:06:24.236809 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:06:37 crc kubenswrapper[5127]: I0201 08:06:37.235950 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:06:37 crc kubenswrapper[5127]: E0201 08:06:37.236608 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:06:50 crc kubenswrapper[5127]: I0201 08:06:50.249175 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:06:50 crc kubenswrapper[5127]: E0201 08:06:50.250678 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:07:04 crc kubenswrapper[5127]: I0201 08:07:04.235881 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:07:04 crc kubenswrapper[5127]: E0201 08:07:04.236715 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:07:17 crc kubenswrapper[5127]: I0201 08:07:17.235861 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:07:17 crc kubenswrapper[5127]: E0201 08:07:17.236879 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:07:32 crc kubenswrapper[5127]: I0201 08:07:32.235608 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:07:32 crc kubenswrapper[5127]: E0201 08:07:32.236543 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:07:47 crc kubenswrapper[5127]: I0201 08:07:47.236340 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:07:47 crc kubenswrapper[5127]: E0201 08:07:47.238294 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:07:59 crc kubenswrapper[5127]: I0201 08:07:59.236378 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:07:59 crc kubenswrapper[5127]: E0201 08:07:59.237535 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:08:11 crc kubenswrapper[5127]: I0201 08:08:11.235649 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:08:11 crc kubenswrapper[5127]: E0201 08:08:11.236471 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:08:23 crc kubenswrapper[5127]: I0201 08:08:23.235038 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:08:23 crc kubenswrapper[5127]: E0201 08:08:23.236852 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:08:38 crc kubenswrapper[5127]: I0201 08:08:38.235109 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:08:38 crc kubenswrapper[5127]: I0201 08:08:38.462261 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2"} Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.750290 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751331 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="extract-utilities" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751353 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="extract-utilities" Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751377 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="registry-server" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751390 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="registry-server" Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751431 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="extract-utilities" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751445 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="extract-utilities" Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751478 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="registry-server" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751492 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="registry-server" Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751516 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="extract-content" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751529 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="extract-content" Feb 01 08:08:44 crc kubenswrapper[5127]: E0201 08:08:44.751560 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="extract-content" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751577 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="extract-content" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751901 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="65510a8c-3e0d-40ba-8fe0-83c74eb36b0e" containerName="registry-server" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.751965 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e77e4ea-3dd5-4f3e-8d3c-ae8ae4f98ce4" containerName="registry-server" Feb 01 08:08:44 crc 
kubenswrapper[5127]: I0201 08:08:44.753714 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.770098 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.880510 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cgt\" (UniqueName: \"kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.880729 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.880816 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.982453 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.982619 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.982697 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cgt\" (UniqueName: \"kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.982951 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:44 crc kubenswrapper[5127]: I0201 08:08:44.983435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:45 crc 
kubenswrapper[5127]: I0201 08:08:45.014767 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cgt\" (UniqueName: \"kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt\") pod \"redhat-marketplace-x5wtd\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:45 crc kubenswrapper[5127]: I0201 08:08:45.071012 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:45 crc kubenswrapper[5127]: I0201 08:08:45.354294 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:08:45 crc kubenswrapper[5127]: I0201 08:08:45.549562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerStarted","Data":"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b"} Feb 01 08:08:45 crc kubenswrapper[5127]: I0201 08:08:45.549632 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerStarted","Data":"3e2fda78acf1f1bce5ba35aa463b3499d55be8264467068e673ae2bd3e4157e2"} Feb 01 08:08:46 crc kubenswrapper[5127]: I0201 08:08:46.561141 5127 generic.go:334] "Generic (PLEG): container finished" podID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerID="ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b" exitCode=0 Feb 01 08:08:46 crc kubenswrapper[5127]: I0201 08:08:46.561295 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerDied","Data":"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b"} Feb 01 08:08:47 crc kubenswrapper[5127]: I0201 08:08:47.573078 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerStarted","Data":"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904"} Feb 01 08:08:48 crc kubenswrapper[5127]: I0201 08:08:48.587034 5127 generic.go:334] "Generic (PLEG): container finished" podID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerID="6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904" exitCode=0 Feb 01 08:08:48 crc kubenswrapper[5127]: I0201 08:08:48.587096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerDied","Data":"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904"} Feb 01 08:08:49 crc kubenswrapper[5127]: I0201 08:08:49.599482 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerStarted","Data":"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7"} Feb 01 08:08:49 crc kubenswrapper[5127]: I0201 08:08:49.630162 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x5wtd" podStartSLOduration=3.196457416 podStartE2EDuration="5.630131092s" podCreationTimestamp="2026-02-01 08:08:44 +0000 UTC" firstStartedPulling="2026-02-01 08:08:46.575305396 +0000 
UTC m=+4877.061207809" lastFinishedPulling="2026-02-01 08:08:49.008979112 +0000 UTC m=+4879.494881485" observedRunningTime="2026-02-01 08:08:49.627971634 +0000 UTC m=+4880.113874007" watchObservedRunningTime="2026-02-01 08:08:49.630131092 +0000 UTC m=+4880.116033495" Feb 01 08:08:55 crc kubenswrapper[5127]: I0201 08:08:55.071606 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:55 crc kubenswrapper[5127]: I0201 08:08:55.073872 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:55 crc kubenswrapper[5127]: I0201 08:08:55.150757 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:55 crc kubenswrapper[5127]: I0201 08:08:55.703484 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:58 crc kubenswrapper[5127]: I0201 08:08:58.609718 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:08:58 crc kubenswrapper[5127]: I0201 08:08:58.677113 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x5wtd" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="registry-server" containerID="cri-o://ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7" gracePeriod=2 Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.148480 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.226458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85cgt\" (UniqueName: \"kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt\") pod \"25d2ee42-2393-4921-be07-b54cca60bfd9\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.226651 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities\") pod \"25d2ee42-2393-4921-be07-b54cca60bfd9\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.226724 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content\") pod \"25d2ee42-2393-4921-be07-b54cca60bfd9\" (UID: \"25d2ee42-2393-4921-be07-b54cca60bfd9\") " Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.227965 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities" (OuterVolumeSpecName: "utilities") pod "25d2ee42-2393-4921-be07-b54cca60bfd9" (UID: "25d2ee42-2393-4921-be07-b54cca60bfd9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.234743 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt" (OuterVolumeSpecName: "kube-api-access-85cgt") pod "25d2ee42-2393-4921-be07-b54cca60bfd9" (UID: "25d2ee42-2393-4921-be07-b54cca60bfd9"). InnerVolumeSpecName "kube-api-access-85cgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.262104 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25d2ee42-2393-4921-be07-b54cca60bfd9" (UID: "25d2ee42-2393-4921-be07-b54cca60bfd9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.328382 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.328438 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25d2ee42-2393-4921-be07-b54cca60bfd9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.328460 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85cgt\" (UniqueName: \"kubernetes.io/projected/25d2ee42-2393-4921-be07-b54cca60bfd9-kube-api-access-85cgt\") on node \"crc\" DevicePath \"\"" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.688522 5127 generic.go:334] "Generic (PLEG): container finished" podID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerID="ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7" exitCode=0 Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.688607 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerDied","Data":"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7"} Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.688680 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x5wtd" event={"ID":"25d2ee42-2393-4921-be07-b54cca60bfd9","Type":"ContainerDied","Data":"3e2fda78acf1f1bce5ba35aa463b3499d55be8264467068e673ae2bd3e4157e2"} Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.688713 5127 scope.go:117] "RemoveContainer" containerID="ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.688727 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x5wtd" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.708273 5127 scope.go:117] "RemoveContainer" containerID="6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.727741 5127 scope.go:117] "RemoveContainer" containerID="ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.785947 5127 scope.go:117] "RemoveContainer" containerID="ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7" Feb 01 08:08:59 crc kubenswrapper[5127]: E0201 08:08:59.791519 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7\": container with ID starting with ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7 not found: ID does not exist" containerID="ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.791625 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7"} err="failed to get container status \"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7\": rpc error: code = NotFound desc = could not find container \"ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7\": container with ID starting with ce8dd3fdd816978e4dec520ba70b6dcbd2c3097e25a65e4223ff6f70159634c7 not found: ID does not exist" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.791681 5127 scope.go:117] "RemoveContainer" containerID="6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904" Feb 01 08:08:59 crc kubenswrapper[5127]: E0201 08:08:59.793606 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904\": container with ID starting with 6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904 not found: ID does not exist" containerID="6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.793677 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904"} err="failed to get container status \"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904\": rpc error: code = NotFound desc = could not find container \"6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904\": container with ID starting with 6b1e9fd0cd0076eb436077175c5d76803107ac3263bd1f2b32393a4e8d588904 not found: ID does not exist" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.793721 5127 scope.go:117] "RemoveContainer" containerID="ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b" Feb 01 08:08:59 crc kubenswrapper[5127]: E0201 08:08:59.794139 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b\": container with ID starting with ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b not found: ID does not exist" containerID="ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b" 
Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.794184 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b"} err="failed to get container status \"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b\": rpc error: code = NotFound desc = could not find container \"ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b\": container with ID starting with ddf484430b04ea77f6a5d4a53aeb6d74031e891b67240c5fedacfdbaf94bb42b not found: ID does not exist" Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.794990 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:08:59 crc kubenswrapper[5127]: I0201 08:08:59.802918 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x5wtd"] Feb 01 08:09:00 crc kubenswrapper[5127]: I0201 08:09:00.244791 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" path="/var/lib/kubelet/pods/25d2ee42-2393-4921-be07-b54cca60bfd9/volumes" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.480920 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:37 crc kubenswrapper[5127]: E0201 08:09:37.481804 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="extract-content" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.481819 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="extract-content" Feb 01 08:09:37 crc kubenswrapper[5127]: E0201 08:09:37.481833 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="extract-utilities" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.481842 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="extract-utilities" Feb 01 08:09:37 crc kubenswrapper[5127]: E0201 08:09:37.481854 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="registry-server" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.481862 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="registry-server" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.482018 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="25d2ee42-2393-4921-be07-b54cca60bfd9" containerName="registry-server" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.483700 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.497986 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.548706 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.548772 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.548834 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr8n8\" (UniqueName: \"kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.650286 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.650340 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.650376 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr8n8\" (UniqueName: \"kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.651211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.651229 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.670029 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nr8n8\" (UniqueName: \"kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8\") pod \"certified-operators-7sgzz\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:37 crc kubenswrapper[5127]: I0201 08:09:37.856905 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:38 crc kubenswrapper[5127]: I0201 08:09:38.315513 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:39 crc kubenswrapper[5127]: I0201 08:09:39.020862 5127 generic.go:334] "Generic (PLEG): container finished" podID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerID="428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f" exitCode=0 Feb 01 08:09:39 crc kubenswrapper[5127]: I0201 08:09:39.020915 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerDied","Data":"428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f"} Feb 01 08:09:39 crc kubenswrapper[5127]: I0201 08:09:39.020950 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerStarted","Data":"715e283c153fc94074a0657944858e21e5987b6b65088483382976a7ca406195"} Feb 01 08:09:39 crc kubenswrapper[5127]: I0201 08:09:39.022335 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:09:41 crc kubenswrapper[5127]: I0201 08:09:41.042547 5127 generic.go:334] "Generic (PLEG): container finished" podID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerID="9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9" exitCode=0 Feb 01 08:09:41 crc kubenswrapper[5127]: I0201 08:09:41.042651 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerDied","Data":"9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9"} Feb 01 08:09:42 crc kubenswrapper[5127]: I0201 08:09:42.057473 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerStarted","Data":"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7"} Feb 01 08:09:42 crc kubenswrapper[5127]: I0201 08:09:42.085387 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7sgzz" podStartSLOduration=2.614714125 podStartE2EDuration="5.085354156s" podCreationTimestamp="2026-02-01 08:09:37 +0000 UTC" firstStartedPulling="2026-02-01 08:09:39.022113673 +0000 UTC m=+4929.508016036" lastFinishedPulling="2026-02-01 08:09:41.492753694 +0000 UTC m=+4931.978656067" observedRunningTime="2026-02-01 08:09:42.083444394 +0000 UTC m=+4932.569346787" watchObservedRunningTime="2026-02-01 08:09:42.085354156 +0000 UTC m=+4932.571256559" Feb 01 08:09:47 crc kubenswrapper[5127]: I0201 08:09:47.857480 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:47 crc kubenswrapper[5127]: I0201 08:09:47.858520 5127 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:47 crc kubenswrapper[5127]: I0201 08:09:47.922174 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:48 crc kubenswrapper[5127]: I0201 08:09:48.200085 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:48 crc kubenswrapper[5127]: I0201 08:09:48.271705 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.139201 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7sgzz" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="registry-server" containerID="cri-o://d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7" gracePeriod=2 Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.663312 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.767024 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities\") pod \"7f252f82-10e9-44a0-80ee-d34540e63b80\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.768048 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content\") pod \"7f252f82-10e9-44a0-80ee-d34540e63b80\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.768087 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nr8n8\" (UniqueName: \"kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8\") pod \"7f252f82-10e9-44a0-80ee-d34540e63b80\" (UID: \"7f252f82-10e9-44a0-80ee-d34540e63b80\") " Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.768000 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities" (OuterVolumeSpecName: "utilities") pod "7f252f82-10e9-44a0-80ee-d34540e63b80" (UID: "7f252f82-10e9-44a0-80ee-d34540e63b80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.784500 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8" (OuterVolumeSpecName: "kube-api-access-nr8n8") pod "7f252f82-10e9-44a0-80ee-d34540e63b80" (UID: "7f252f82-10e9-44a0-80ee-d34540e63b80"). InnerVolumeSpecName "kube-api-access-nr8n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.834371 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f252f82-10e9-44a0-80ee-d34540e63b80" (UID: "7f252f82-10e9-44a0-80ee-d34540e63b80"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.869208 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.869257 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f252f82-10e9-44a0-80ee-d34540e63b80-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:09:50 crc kubenswrapper[5127]: I0201 08:09:50.869270 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nr8n8\" (UniqueName: \"kubernetes.io/projected/7f252f82-10e9-44a0-80ee-d34540e63b80-kube-api-access-nr8n8\") on node \"crc\" DevicePath \"\"" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.152359 5127 generic.go:334] "Generic (PLEG): container finished" podID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerID="d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7" exitCode=0 Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.152432 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerDied","Data":"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7"} Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.152471 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7sgzz" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.152503 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7sgzz" event={"ID":"7f252f82-10e9-44a0-80ee-d34540e63b80","Type":"ContainerDied","Data":"715e283c153fc94074a0657944858e21e5987b6b65088483382976a7ca406195"} Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.152542 5127 scope.go:117] "RemoveContainer" containerID="d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.183198 5127 scope.go:117] "RemoveContainer" containerID="9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.226439 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.226567 5127 scope.go:117] "RemoveContainer" containerID="428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.233269 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7sgzz"] Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.255818 5127 scope.go:117] "RemoveContainer" containerID="d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7" Feb 01 08:09:51 crc kubenswrapper[5127]: E0201 08:09:51.256678 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7\": container with ID starting with d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7 not found: ID does not exist" containerID="d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7" Feb 01 08:09:51 crc 
kubenswrapper[5127]: I0201 08:09:51.256753 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7"} err="failed to get container status \"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7\": rpc error: code = NotFound desc = could not find container \"d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7\": container with ID starting with d11b0ddc146353d51a1106f7d4d7d27bff13aad133a82bf78e7a9c732087c6b7 not found: ID does not exist" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.256793 5127 scope.go:117] "RemoveContainer" containerID="9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9" Feb 01 08:09:51 crc kubenswrapper[5127]: E0201 08:09:51.257317 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9\": container with ID starting with 9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9 not found: ID does not exist" containerID="9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.257454 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9"} err="failed to get container status \"9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9\": rpc error: code = NotFound desc = could not find container \"9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9\": container with ID starting with 9354c2ecea777a8b3e83b336c33b72f9f052edc6af97a29259089fce3a969db9 not found: ID does not exist" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.257569 5127 scope.go:117] "RemoveContainer" containerID="428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f" Feb 01 08:09:51 crc kubenswrapper[5127]: E0201 08:09:51.258129 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f\": container with ID starting with 428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f not found: ID does not exist" containerID="428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f" Feb 01 08:09:51 crc kubenswrapper[5127]: I0201 08:09:51.258255 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f"} err="failed to get container status \"428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f\": rpc error: code = NotFound desc = could not find container \"428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f\": container with ID starting with 428c9324a4eb6765068c8f87eab8b42139bfa079d31e8f1c855ff7576cf3465f not found: ID does not exist" Feb 01 08:09:52 crc kubenswrapper[5127]: I0201 08:09:52.250150 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" path="/var/lib/kubelet/pods/7f252f82-10e9-44a0-80ee-d34540e63b80/volumes" Feb 01 08:11:06 crc kubenswrapper[5127]: I0201 08:11:06.741781 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:11:06 crc kubenswrapper[5127]: I0201 08:11:06.742421 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:11:36 crc kubenswrapper[5127]: I0201 08:11:36.740720 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:11:36 crc kubenswrapper[5127]: I0201 08:11:36.741391 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:12:06 crc kubenswrapper[5127]: I0201 08:12:06.740620 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:12:06 crc kubenswrapper[5127]: I0201 08:12:06.741125 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:12:06 crc kubenswrapper[5127]: I0201 08:12:06.741169 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:12:06 crc kubenswrapper[5127]: I0201 08:12:06.741935 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:12:06 crc kubenswrapper[5127]: I0201 08:12:06.741993 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2" gracePeriod=600 Feb 01 08:12:07 crc kubenswrapper[5127]: I0201 08:12:07.472550 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2" exitCode=0 Feb 01 08:12:07 crc kubenswrapper[5127]: I0201 08:12:07.472609 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2"} Feb 01 08:12:07 crc kubenswrapper[5127]: I0201 08:12:07.473094 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974"} Feb 01 08:12:07 crc kubenswrapper[5127]: I0201 08:12:07.473126 5127 scope.go:117] "RemoveContainer" containerID="d257ea3296ce95ecb1b38894c2447bb57ec2c1a58f25c8636ef38c0fd06969d4" Feb 01 08:14:36 crc kubenswrapper[5127]: I0201 08:14:36.741147 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:14:36 crc kubenswrapper[5127]: I0201 08:14:36.742022 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.558179 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:40 crc kubenswrapper[5127]: E0201 08:14:40.560567 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="extract-content" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.560747 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="extract-content" Feb 01 08:14:40 crc kubenswrapper[5127]: E0201 08:14:40.560870 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="registry-server" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.560981 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="registry-server" Feb 01 08:14:40 crc kubenswrapper[5127]: E0201 08:14:40.561098 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="extract-utilities" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.561221 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="extract-utilities" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.561574 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f252f82-10e9-44a0-80ee-d34540e63b80" containerName="registry-server" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.563253 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.581465 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.741777 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk899\" (UniqueName: \"kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.741922 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.742037 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.843461 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.843667 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk899\" (UniqueName: \"kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.843785 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.844374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.844383 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.870952 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wk899\" (UniqueName: \"kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899\") pod \"community-operators-6hbcp\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:40 crc kubenswrapper[5127]: I0201 08:14:40.883663 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:41 crc kubenswrapper[5127]: I0201 08:14:41.403732 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:41 crc kubenswrapper[5127]: I0201 08:14:41.984442 5127 generic.go:334] "Generic (PLEG): container finished" podID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerID="f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886" exitCode=0 Feb 01 08:14:41 crc kubenswrapper[5127]: I0201 08:14:41.984682 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerDied","Data":"f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886"} Feb 01 08:14:41 crc kubenswrapper[5127]: I0201 08:14:41.984778 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerStarted","Data":"f93c4307c8ad7ff0e0f3931c5123a4803156dfe15034b4fb144ed1c3cfc413ea"} Feb 01 08:14:41 crc kubenswrapper[5127]: I0201 08:14:41.987563 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:14:42 crc kubenswrapper[5127]: I0201 08:14:42.993468 5127 generic.go:334] "Generic (PLEG): container finished" podID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerID="0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52" exitCode=0 Feb 01 08:14:42 crc kubenswrapper[5127]: I0201 08:14:42.993527 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerDied","Data":"0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52"} Feb 01 08:14:44 crc kubenswrapper[5127]: I0201 08:14:44.005713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerStarted","Data":"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf"} Feb 01 08:14:44 crc kubenswrapper[5127]: I0201 08:14:44.042299 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6hbcp" podStartSLOduration=2.646120712 podStartE2EDuration="4.042264655s" podCreationTimestamp="2026-02-01 08:14:40 +0000 UTC" firstStartedPulling="2026-02-01 08:14:41.987233397 +0000 UTC m=+5232.473135760" lastFinishedPulling="2026-02-01 08:14:43.38337733 +0000 UTC m=+5233.869279703" observedRunningTime="2026-02-01 08:14:44.032198015 +0000 UTC m=+5234.518100408" watchObservedRunningTime="2026-02-01 08:14:44.042264655 +0000 UTC m=+5234.528167058" Feb 01 08:14:50 crc kubenswrapper[5127]: I0201 08:14:50.884857 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:50 crc kubenswrapper[5127]: I0201 08:14:50.885559 5127 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:50 crc kubenswrapper[5127]: I0201 08:14:50.966599 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:51 crc kubenswrapper[5127]: I0201 08:14:51.142821 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:51 crc kubenswrapper[5127]: I0201 08:14:51.224250 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.095623 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6hbcp" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="registry-server" containerID="cri-o://13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf" gracePeriod=2 Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.558966 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.745790 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk899\" (UniqueName: \"kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899\") pod \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.745901 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content\") pod \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.745978 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities\") pod \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\" (UID: \"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e\") " Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.747245 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities" (OuterVolumeSpecName: "utilities") pod "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" (UID: "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.752144 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899" (OuterVolumeSpecName: "kube-api-access-wk899") pod "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" (UID: "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e"). InnerVolumeSpecName "kube-api-access-wk899". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.847915 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk899\" (UniqueName: \"kubernetes.io/projected/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-kube-api-access-wk899\") on node \"crc\" DevicePath \"\"" Feb 01 08:14:53 crc kubenswrapper[5127]: I0201 08:14:53.847959 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.106451 5127 generic.go:334] "Generic (PLEG): container finished" podID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerID="13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf" exitCode=0 Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.106528 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerDied","Data":"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf"} Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.106567 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6hbcp" event={"ID":"dd7ae8fd-6df4-4414-944b-7d1e3105ae7e","Type":"ContainerDied","Data":"f93c4307c8ad7ff0e0f3931c5123a4803156dfe15034b4fb144ed1c3cfc413ea"} Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.106604 5127 scope.go:117] "RemoveContainer" containerID="13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.106598 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6hbcp" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.130165 5127 scope.go:117] "RemoveContainer" containerID="0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.151198 5127 scope.go:117] "RemoveContainer" containerID="f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.179475 5127 scope.go:117] "RemoveContainer" containerID="13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf" Feb 01 08:14:54 crc kubenswrapper[5127]: E0201 08:14:54.180109 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf\": container with ID starting with 13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf not found: ID does not exist" containerID="13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.180177 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf"} err="failed to get container status \"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf\": rpc error: code = NotFound desc = could not find container \"13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf\": container with ID starting with 13a00ac0d2ffe64cd90739ae5ed368cce9ad366973d745aeb4232b4901200faf not found: ID does not exist" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.180221 5127 scope.go:117] "RemoveContainer" containerID="0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52" Feb 01 08:14:54 crc kubenswrapper[5127]: E0201 08:14:54.180612 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52\": container with ID starting with 0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52 not found: ID does not exist" containerID="0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.180660 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52"} err="failed to get container status \"0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52\": rpc error: code = NotFound desc = could not find container \"0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52\": container with ID starting with 0bfa744763557a938222e225b220637fd799eef6af190065b7b5c8c0a701fd52 not found: ID does not exist" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.180692 5127 scope.go:117] "RemoveContainer" containerID="f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886" Feb 01 08:14:54 crc kubenswrapper[5127]: E0201 08:14:54.180934 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886\": container with ID starting with f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886 not found: ID does not exist" containerID="f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886" 
Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.180967 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886"} err="failed to get container status \"f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886\": rpc error: code = NotFound desc = could not find container \"f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886\": container with ID starting with f8bf342c2b61fd7f0c522562db12ad86dd74982db32ed0aeb01a3ec24aa20886 not found: ID does not exist" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.207756 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" (UID: "dd7ae8fd-6df4-4414-944b-7d1e3105ae7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.252266 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.434814 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:54 crc kubenswrapper[5127]: I0201 08:14:54.441086 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6hbcp"] Feb 01 08:14:56 crc kubenswrapper[5127]: I0201 08:14:56.249058 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" path="/var/lib/kubelet/pods/dd7ae8fd-6df4-4414-944b-7d1e3105ae7e/volumes" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.174829 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4"] Feb 01 08:15:00 crc kubenswrapper[5127]: E0201 08:15:00.175387 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="extract-content" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.175400 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="extract-content" Feb 01 08:15:00 crc kubenswrapper[5127]: E0201 08:15:00.175415 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="extract-utilities" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.175421 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="extract-utilities" Feb 01 08:15:00 crc kubenswrapper[5127]: E0201 08:15:00.175451 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="registry-server" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.175457 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="registry-server" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.175612 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7ae8fd-6df4-4414-944b-7d1e3105ae7e" containerName="registry-server" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 
08:15:00.176148 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.178367 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.178534 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.181722 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.181930 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.182036 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fg9f\" (UniqueName: \"kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.198912 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4"] Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.282930 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.282978 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.283016 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fg9f\" (UniqueName: \"kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.284100 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.300402 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.302756 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fg9f\" (UniqueName: \"kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f\") pod \"collect-profiles-29498895-xrkj4\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.490921 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:00 crc kubenswrapper[5127]: I0201 08:15:00.822175 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4"] Feb 01 08:15:01 crc kubenswrapper[5127]: I0201 08:15:01.153574 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" event={"ID":"73f741ee-d517-4882-9f2d-c940b1dfa333","Type":"ContainerStarted","Data":"983bbad0d04e7290731fcff2ddd90c166ca9e264f3ee9e8403d19c886a59675f"} Feb 01 08:15:01 crc kubenswrapper[5127]: I0201 08:15:01.153662 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" event={"ID":"73f741ee-d517-4882-9f2d-c940b1dfa333","Type":"ContainerStarted","Data":"cb77effee8397160cf32344179cc1db85c9226f8b8c8ff74f7162ce55bb8b600"} Feb 01 08:15:02 crc kubenswrapper[5127]: I0201 08:15:02.162807 5127 generic.go:334] "Generic (PLEG): container finished" podID="73f741ee-d517-4882-9f2d-c940b1dfa333" containerID="983bbad0d04e7290731fcff2ddd90c166ca9e264f3ee9e8403d19c886a59675f" exitCode=0 Feb 01 08:15:02 crc kubenswrapper[5127]: I0201 08:15:02.162848 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" event={"ID":"73f741ee-d517-4882-9f2d-c940b1dfa333","Type":"ContainerDied","Data":"983bbad0d04e7290731fcff2ddd90c166ca9e264f3ee9e8403d19c886a59675f"} Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.439476 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.630552 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fg9f\" (UniqueName: \"kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f\") pod \"73f741ee-d517-4882-9f2d-c940b1dfa333\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.630656 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume\") pod \"73f741ee-d517-4882-9f2d-c940b1dfa333\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.630785 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume\") pod \"73f741ee-d517-4882-9f2d-c940b1dfa333\" (UID: \"73f741ee-d517-4882-9f2d-c940b1dfa333\") " Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.631916 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume" (OuterVolumeSpecName: "config-volume") pod "73f741ee-d517-4882-9f2d-c940b1dfa333" (UID: "73f741ee-d517-4882-9f2d-c940b1dfa333"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.639212 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f" (OuterVolumeSpecName: "kube-api-access-7fg9f") pod "73f741ee-d517-4882-9f2d-c940b1dfa333" (UID: "73f741ee-d517-4882-9f2d-c940b1dfa333"). InnerVolumeSpecName "kube-api-access-7fg9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.647383 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73f741ee-d517-4882-9f2d-c940b1dfa333" (UID: "73f741ee-d517-4882-9f2d-c940b1dfa333"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.734380 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fg9f\" (UniqueName: \"kubernetes.io/projected/73f741ee-d517-4882-9f2d-c940b1dfa333-kube-api-access-7fg9f\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.734440 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73f741ee-d517-4882-9f2d-c940b1dfa333-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:03 crc kubenswrapper[5127]: I0201 08:15:03.734512 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73f741ee-d517-4882-9f2d-c940b1dfa333-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:15:04 crc kubenswrapper[5127]: I0201 08:15:04.180368 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" event={"ID":"73f741ee-d517-4882-9f2d-c940b1dfa333","Type":"ContainerDied","Data":"cb77effee8397160cf32344179cc1db85c9226f8b8c8ff74f7162ce55bb8b600"} Feb 01 08:15:04 crc kubenswrapper[5127]: I0201 08:15:04.180415 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb77effee8397160cf32344179cc1db85c9226f8b8c8ff74f7162ce55bb8b600" Feb 01 08:15:04 crc kubenswrapper[5127]: I0201 08:15:04.180466 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4" Feb 01 08:15:04 crc kubenswrapper[5127]: I0201 08:15:04.263210 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v"] Feb 01 08:15:04 crc kubenswrapper[5127]: I0201 08:15:04.272307 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498850-scw7v"] Feb 01 08:15:06 crc kubenswrapper[5127]: I0201 08:15:06.257305 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7dee2c-f497-4ddf-89ef-19b5f937965b" path="/var/lib/kubelet/pods/7d7dee2c-f497-4ddf-89ef-19b5f937965b/volumes" Feb 01 08:15:06 crc kubenswrapper[5127]: I0201 08:15:06.741274 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:15:06 crc kubenswrapper[5127]: I0201 08:15:06.741359 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:15:36 crc kubenswrapper[5127]: I0201 08:15:36.740807 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:15:36 crc kubenswrapper[5127]: I0201 08:15:36.741632 5127 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:15:36 crc kubenswrapper[5127]: I0201 08:15:36.741699 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:15:36 crc kubenswrapper[5127]: I0201 08:15:36.742566 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:15:36 crc kubenswrapper[5127]: I0201 08:15:36.742703 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" gracePeriod=600 Feb 01 08:15:36 crc kubenswrapper[5127]: E0201 08:15:36.897056 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:15:37 crc kubenswrapper[5127]: I0201 08:15:37.496020 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" exitCode=0 Feb 01 08:15:37 crc kubenswrapper[5127]: I0201 08:15:37.496076 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974"} Feb 01 08:15:37 crc kubenswrapper[5127]: I0201 08:15:37.496165 5127 scope.go:117] "RemoveContainer" containerID="80c3dedcd20f2fd11c61e3372bfe8c05e440e77f74629f5f0fcaa8ee398bcfb2" Feb 01 08:15:37 crc kubenswrapper[5127]: I0201 08:15:37.497212 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:15:37 crc kubenswrapper[5127]: E0201 08:15:37.497767 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:15:41 crc kubenswrapper[5127]: I0201 08:15:41.497530 5127 scope.go:117] "RemoveContainer" containerID="c66b1c6cc7ad86059ba5529ee83a63ed8099ff594661045d34942aa7997b3242" Feb 01 08:15:50 crc kubenswrapper[5127]: I0201 08:15:50.237060 5127 scope.go:117] "RemoveContainer" 
containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:15:50 crc kubenswrapper[5127]: E0201 08:15:50.238387 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:16:03 crc kubenswrapper[5127]: I0201 08:16:03.235883 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:16:03 crc kubenswrapper[5127]: E0201 08:16:03.237837 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:16:17 crc kubenswrapper[5127]: I0201 08:16:17.236553 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:16:17 crc kubenswrapper[5127]: E0201 08:16:17.237866 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:16:32 crc kubenswrapper[5127]: I0201 08:16:32.236244 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:16:32 crc kubenswrapper[5127]: E0201 08:16:32.237698 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:16:45 crc kubenswrapper[5127]: I0201 08:16:45.236519 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:16:45 crc kubenswrapper[5127]: E0201 08:16:45.239720 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:16:57 crc kubenswrapper[5127]: I0201 08:16:57.236250 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:16:57 crc kubenswrapper[5127]: E0201 08:16:57.237495 5127 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:17:08 crc kubenswrapper[5127]: I0201 08:17:08.236185 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:17:08 crc kubenswrapper[5127]: E0201 08:17:08.237175 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:17:22 crc kubenswrapper[5127]: I0201 08:17:22.236430 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:17:22 crc kubenswrapper[5127]: E0201 08:17:22.237719 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:17:36 crc kubenswrapper[5127]: I0201 08:17:36.235925 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:17:36 crc kubenswrapper[5127]: E0201 08:17:36.238795 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:17:50 crc kubenswrapper[5127]: I0201 08:17:50.244765 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:17:50 crc kubenswrapper[5127]: E0201 08:17:50.245838 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:18:04 crc kubenswrapper[5127]: I0201 08:18:04.235821 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:18:04 crc kubenswrapper[5127]: E0201 08:18:04.237755 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:18:19 crc kubenswrapper[5127]: I0201 08:18:19.238085 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:18:19 crc kubenswrapper[5127]: E0201 08:18:19.239332 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:18:34 crc kubenswrapper[5127]: I0201 08:18:34.236878 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:18:34 crc kubenswrapper[5127]: E0201 08:18:34.237989 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:18:45 crc kubenswrapper[5127]: I0201 08:18:45.236167 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:18:45 crc kubenswrapper[5127]: E0201 08:18:45.237095 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:18:57 crc kubenswrapper[5127]: I0201 08:18:57.235961 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:18:57 crc kubenswrapper[5127]: E0201 08:18:57.236702 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:19:12 crc kubenswrapper[5127]: I0201 08:19:12.236636 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:19:12 crc kubenswrapper[5127]: E0201 08:19:12.237884 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:19:24 crc kubenswrapper[5127]: I0201 08:19:24.236453 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:19:24 crc kubenswrapper[5127]: E0201 08:19:24.237576 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:19:37 crc kubenswrapper[5127]: I0201 08:19:37.236119 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:19:37 crc kubenswrapper[5127]: E0201 08:19:37.236977 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:19:39 crc kubenswrapper[5127]: I0201 08:19:39.980272 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:39 crc kubenswrapper[5127]: E0201 08:19:39.981143 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f741ee-d517-4882-9f2d-c940b1dfa333" containerName="collect-profiles" Feb 01 08:19:39 crc kubenswrapper[5127]: I0201 08:19:39.981171 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f741ee-d517-4882-9f2d-c940b1dfa333" containerName="collect-profiles" Feb 01 08:19:39 crc kubenswrapper[5127]: I0201 08:19:39.981447 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f741ee-d517-4882-9f2d-c940b1dfa333" containerName="collect-profiles" Feb 01 08:19:39 crc kubenswrapper[5127]: I0201 08:19:39.983476 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.004190 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.040803 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntd76\" (UniqueName: \"kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.040882 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.040991 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.142920 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntd76\" (UniqueName: \"kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.142988 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.143054 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.143774 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.144004 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.171079 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ntd76\" (UniqueName: \"kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76\") pod \"redhat-marketplace-zrmhq\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.352141 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.606889 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:40 crc kubenswrapper[5127]: I0201 08:19:40.783553 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerStarted","Data":"a7a64972001c64af5496267af9d821155b0048e9f9304833708449aa98d8ceed"} Feb 01 08:19:41 crc kubenswrapper[5127]: I0201 08:19:41.796218 5127 generic.go:334] "Generic (PLEG): container finished" podID="b399a6db-7119-4897-a382-d0d8b717024d" containerID="2a993f4a8f603d653578bd1fd9b4b1bb2f68caca01e7b40c8b67f95ee15dd710" exitCode=0 Feb 01 08:19:41 crc kubenswrapper[5127]: I0201 08:19:41.796321 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerDied","Data":"2a993f4a8f603d653578bd1fd9b4b1bb2f68caca01e7b40c8b67f95ee15dd710"} Feb 01 08:19:42 crc kubenswrapper[5127]: I0201 08:19:42.808660 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerStarted","Data":"06da351c18534b12c1b1cf3e9d240ec9a71e7bc04d0486a25f0ba5a420905bb7"} Feb 01 08:19:43 crc kubenswrapper[5127]: I0201 08:19:43.821945 5127 generic.go:334] "Generic (PLEG): container finished" podID="b399a6db-7119-4897-a382-d0d8b717024d" containerID="06da351c18534b12c1b1cf3e9d240ec9a71e7bc04d0486a25f0ba5a420905bb7" exitCode=0 Feb 01 08:19:43 crc kubenswrapper[5127]: I0201 08:19:43.821997 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerDied","Data":"06da351c18534b12c1b1cf3e9d240ec9a71e7bc04d0486a25f0ba5a420905bb7"} Feb 01 08:19:43 crc kubenswrapper[5127]: I0201 08:19:43.825125 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:19:44 crc kubenswrapper[5127]: I0201 08:19:44.844620 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerStarted","Data":"f8e4ecf51a2377da8009406554a4c5247d0a7b7e50be6473aaa2c44cc1d9ffaf"} Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.245166 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:19:50 crc kubenswrapper[5127]: E0201 08:19:50.245953 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.352432 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.352487 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.424384 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.440639 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zrmhq" podStartSLOduration=8.907102533 podStartE2EDuration="11.440620687s" podCreationTimestamp="2026-02-01 08:19:39 +0000 UTC" firstStartedPulling="2026-02-01 08:19:41.798800453 +0000 UTC m=+5532.284702846" lastFinishedPulling="2026-02-01 08:19:44.332318597 +0000 UTC m=+5534.818221000" observedRunningTime="2026-02-01 08:19:44.875257911 +0000 UTC m=+5535.361160274" watchObservedRunningTime="2026-02-01 08:19:50.440620687 +0000 UTC m=+5540.926523050" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.944157 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:50 crc kubenswrapper[5127]: I0201 08:19:50.998277 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:52 crc kubenswrapper[5127]: I0201 08:19:52.912550 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zrmhq" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="registry-server" containerID="cri-o://f8e4ecf51a2377da8009406554a4c5247d0a7b7e50be6473aaa2c44cc1d9ffaf" gracePeriod=2 Feb 01 08:19:53 crc kubenswrapper[5127]: I0201 08:19:53.924711 5127 generic.go:334] "Generic (PLEG): container finished" podID="b399a6db-7119-4897-a382-d0d8b717024d" containerID="f8e4ecf51a2377da8009406554a4c5247d0a7b7e50be6473aaa2c44cc1d9ffaf" exitCode=0 Feb 01 08:19:53 crc kubenswrapper[5127]: I0201 08:19:53.924790 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerDied","Data":"f8e4ecf51a2377da8009406554a4c5247d0a7b7e50be6473aaa2c44cc1d9ffaf"} Feb 01 08:19:53 crc kubenswrapper[5127]: I0201 08:19:53.981697 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.060995 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content\") pod \"b399a6db-7119-4897-a382-d0d8b717024d\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.061115 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntd76\" (UniqueName: \"kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76\") pod \"b399a6db-7119-4897-a382-d0d8b717024d\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.061234 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities\") pod \"b399a6db-7119-4897-a382-d0d8b717024d\" (UID: \"b399a6db-7119-4897-a382-d0d8b717024d\") " Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.062563 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities" (OuterVolumeSpecName: "utilities") pod "b399a6db-7119-4897-a382-d0d8b717024d" (UID: "b399a6db-7119-4897-a382-d0d8b717024d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.072061 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76" (OuterVolumeSpecName: "kube-api-access-ntd76") pod "b399a6db-7119-4897-a382-d0d8b717024d" (UID: "b399a6db-7119-4897-a382-d0d8b717024d"). InnerVolumeSpecName "kube-api-access-ntd76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.110630 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b399a6db-7119-4897-a382-d0d8b717024d" (UID: "b399a6db-7119-4897-a382-d0d8b717024d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.163479 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.163517 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntd76\" (UniqueName: \"kubernetes.io/projected/b399a6db-7119-4897-a382-d0d8b717024d-kube-api-access-ntd76\") on node \"crc\" DevicePath \"\"" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.163528 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b399a6db-7119-4897-a382-d0d8b717024d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.937193 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zrmhq" event={"ID":"b399a6db-7119-4897-a382-d0d8b717024d","Type":"ContainerDied","Data":"a7a64972001c64af5496267af9d821155b0048e9f9304833708449aa98d8ceed"} Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.937689 5127 scope.go:117] "RemoveContainer" containerID="f8e4ecf51a2377da8009406554a4c5247d0a7b7e50be6473aaa2c44cc1d9ffaf" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.937287 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zrmhq" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.972351 5127 scope.go:117] "RemoveContainer" containerID="06da351c18534b12c1b1cf3e9d240ec9a71e7bc04d0486a25f0ba5a420905bb7" Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.979048 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:54 crc kubenswrapper[5127]: I0201 08:19:54.991211 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zrmhq"] Feb 01 08:19:55 crc kubenswrapper[5127]: I0201 08:19:55.102774 5127 scope.go:117] "RemoveContainer" containerID="2a993f4a8f603d653578bd1fd9b4b1bb2f68caca01e7b40c8b67f95ee15dd710" Feb 01 08:19:56 crc kubenswrapper[5127]: I0201 08:19:56.247102 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b399a6db-7119-4897-a382-d0d8b717024d" path="/var/lib/kubelet/pods/b399a6db-7119-4897-a382-d0d8b717024d/volumes" Feb 01 08:20:01 crc kubenswrapper[5127]: I0201 08:20:01.236048 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:20:01 crc kubenswrapper[5127]: E0201 08:20:01.236762 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:20:13 crc kubenswrapper[5127]: I0201 08:20:13.235873 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:20:13 crc kubenswrapper[5127]: E0201 08:20:13.236871 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:20:26 crc kubenswrapper[5127]: I0201 08:20:26.236158 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:20:26 crc kubenswrapper[5127]: E0201 08:20:26.236936 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:20:40 crc kubenswrapper[5127]: I0201 08:20:40.243359 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:20:41 crc kubenswrapper[5127]: I0201 08:20:41.361182 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682"} Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.468973 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:05 crc kubenswrapper[5127]: E0201 08:21:05.470276 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="extract-content" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.470299 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="extract-content" Feb 01 08:21:05 crc kubenswrapper[5127]: E0201 08:21:05.470327 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="extract-utilities" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.470336 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="extract-utilities" Feb 01 08:21:05 crc kubenswrapper[5127]: E0201 08:21:05.470358 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="registry-server" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.470366 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="registry-server" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.470738 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b399a6db-7119-4897-a382-d0d8b717024d" containerName="registry-server" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.473078 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.488978 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.577182 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.577292 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c487\" (UniqueName: \"kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.577472 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.678710 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.679059 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.679201 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c487\" (UniqueName: \"kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.679536 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.679672 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.702640 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4c487\" (UniqueName: \"kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487\") pod \"certified-operators-646j4\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:05 crc kubenswrapper[5127]: I0201 08:21:05.823298 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:06 crc kubenswrapper[5127]: I0201 08:21:06.299868 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:06 crc kubenswrapper[5127]: I0201 08:21:06.596853 5127 generic.go:334] "Generic (PLEG): container finished" podID="bf6c1685-280e-4e93-b945-e5624be4570b" containerID="74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b" exitCode=0 Feb 01 08:21:06 crc kubenswrapper[5127]: I0201 08:21:06.596938 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerDied","Data":"74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b"} Feb 01 08:21:06 crc kubenswrapper[5127]: I0201 08:21:06.598314 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerStarted","Data":"53e5998004ca5731c1c1a8274701dfe01623d755ff6e0424e71e6107ec0d10b7"} Feb 01 08:21:07 crc kubenswrapper[5127]: I0201 08:21:07.609707 5127 generic.go:334] "Generic (PLEG): container finished" podID="bf6c1685-280e-4e93-b945-e5624be4570b" containerID="3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f" exitCode=0 Feb 01 08:21:07 crc kubenswrapper[5127]: I0201 08:21:07.609778 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerDied","Data":"3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f"} Feb 01 08:21:08 crc kubenswrapper[5127]: I0201 08:21:08.625157 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerStarted","Data":"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab"} Feb 01 08:21:08 crc kubenswrapper[5127]: I0201 08:21:08.664423 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-646j4" podStartSLOduration=2.244316769 podStartE2EDuration="3.664380543s" podCreationTimestamp="2026-02-01 08:21:05 +0000 UTC" firstStartedPulling="2026-02-01 08:21:06.599291076 +0000 UTC m=+5617.085193439" lastFinishedPulling="2026-02-01 08:21:08.01935485 +0000 UTC m=+5618.505257213" observedRunningTime="2026-02-01 08:21:08.657283703 +0000 UTC m=+5619.143186126" watchObservedRunningTime="2026-02-01 08:21:08.664380543 +0000 UTC m=+5619.150282946" Feb 01 08:21:15 crc kubenswrapper[5127]: I0201 08:21:15.823419 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:15 crc kubenswrapper[5127]: I0201 08:21:15.824394 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:15 crc kubenswrapper[5127]: I0201 08:21:15.894525 5127 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:16 crc kubenswrapper[5127]: I0201 08:21:16.770152 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:16 crc kubenswrapper[5127]: I0201 08:21:16.836704 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:18 crc kubenswrapper[5127]: I0201 08:21:18.714780 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-646j4" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="registry-server" containerID="cri-o://5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab" gracePeriod=2 Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.211063 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.312956 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c487\" (UniqueName: \"kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487\") pod \"bf6c1685-280e-4e93-b945-e5624be4570b\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.313023 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities\") pod \"bf6c1685-280e-4e93-b945-e5624be4570b\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.313160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content\") pod \"bf6c1685-280e-4e93-b945-e5624be4570b\" (UID: \"bf6c1685-280e-4e93-b945-e5624be4570b\") " Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.315563 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities" (OuterVolumeSpecName: "utilities") pod "bf6c1685-280e-4e93-b945-e5624be4570b" (UID: "bf6c1685-280e-4e93-b945-e5624be4570b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.322595 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487" (OuterVolumeSpecName: "kube-api-access-4c487") pod "bf6c1685-280e-4e93-b945-e5624be4570b" (UID: "bf6c1685-280e-4e93-b945-e5624be4570b"). InnerVolumeSpecName "kube-api-access-4c487". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.355856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf6c1685-280e-4e93-b945-e5624be4570b" (UID: "bf6c1685-280e-4e93-b945-e5624be4570b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.415230 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.415291 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c487\" (UniqueName: \"kubernetes.io/projected/bf6c1685-280e-4e93-b945-e5624be4570b-kube-api-access-4c487\") on node \"crc\" DevicePath \"\"" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.415311 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6c1685-280e-4e93-b945-e5624be4570b-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.725308 5127 generic.go:334] "Generic (PLEG): container finished" podID="bf6c1685-280e-4e93-b945-e5624be4570b" containerID="5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab" exitCode=0 Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.725347 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerDied","Data":"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab"} Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.725372 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-646j4" event={"ID":"bf6c1685-280e-4e93-b945-e5624be4570b","Type":"ContainerDied","Data":"53e5998004ca5731c1c1a8274701dfe01623d755ff6e0424e71e6107ec0d10b7"} Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.725389 5127 scope.go:117] "RemoveContainer" containerID="5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.725394 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-646j4" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.774427 5127 scope.go:117] "RemoveContainer" containerID="3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.781012 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.796655 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-646j4"] Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.805989 5127 scope.go:117] "RemoveContainer" containerID="74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.831736 5127 scope.go:117] "RemoveContainer" containerID="5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab" Feb 01 08:21:19 crc kubenswrapper[5127]: E0201 08:21:19.832214 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab\": container with ID starting with 5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab not found: ID does not exist" containerID="5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.832255 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab"} err="failed to get container status \"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab\": rpc error: code = NotFound desc = could not find container \"5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab\": container with ID starting with 5ed467c83adc2b40a8e59e36c9da3400cff3f7ca9d38cf54783620b3f5e045ab not found: ID does not exist" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.832286 5127 scope.go:117] "RemoveContainer" containerID="3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f" Feb 01 08:21:19 crc kubenswrapper[5127]: E0201 08:21:19.832766 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f\": container with ID starting with 3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f not found: ID does not exist" containerID="3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.832852 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f"} err="failed to get container status \"3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f\": rpc error: code = NotFound desc = could not find container \"3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f\": container with ID starting with 3dde302a6667b1a32c3235e7cb0b94f46499894a5bc61e22d1b7e10b7350345f not found: ID does not exist" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.832897 5127 scope.go:117] "RemoveContainer" containerID="74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b" Feb 01 08:21:19 crc kubenswrapper[5127]: E0201 08:21:19.833544 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b\": container with ID starting with 74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b not found: ID does not exist" containerID="74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b" Feb 01 08:21:19 crc kubenswrapper[5127]: I0201 08:21:19.833668 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b"} err="failed to get container status \"74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b\": rpc error: code = NotFound desc = could not find container \"74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b\": container with ID starting with 74bfe64354931a8dd2b7c999275263e6499b54e5269298d8a18f9e5c24e5765b not found: ID does not exist" Feb 01 08:21:20 crc kubenswrapper[5127]: I0201 08:21:20.251571 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" path="/var/lib/kubelet/pods/bf6c1685-280e-4e93-b945-e5624be4570b/volumes" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.813072 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"] Feb 01 08:22:13 crc kubenswrapper[5127]: E0201 08:22:13.815229 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="extract-utilities" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.815270 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="extract-utilities" Feb 01 08:22:13 crc kubenswrapper[5127]: E0201 08:22:13.815309 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="registry-server" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.815329 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="registry-server" Feb 01 08:22:13 crc kubenswrapper[5127]: E0201 08:22:13.815368 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="extract-content" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.815387 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="extract-content" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.815763 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6c1685-280e-4e93-b945-e5624be4570b" containerName="registry-server" Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.818421 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.826418 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"]
Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.938011 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gtv5\" (UniqueName: \"kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.938113 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:13 crc kubenswrapper[5127]: I0201 08:22:13.938180 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.039621 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.039990 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gtv5\" (UniqueName: \"kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.040054 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.040421 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.040573 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.068715 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-9gtv5\" (UniqueName: \"kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5\") pod \"redhat-operators-7mj9s\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " pod="openshift-marketplace/redhat-operators-7mj9s" Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.146484 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj9s" Feb 01 08:22:14 crc kubenswrapper[5127]: I0201 08:22:14.578972 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"] Feb 01 08:22:15 crc kubenswrapper[5127]: I0201 08:22:15.222318 5127 generic.go:334] "Generic (PLEG): container finished" podID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerID="641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c" exitCode=0 Feb 01 08:22:15 crc kubenswrapper[5127]: I0201 08:22:15.222381 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerDied","Data":"641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c"} Feb 01 08:22:15 crc kubenswrapper[5127]: I0201 08:22:15.222417 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerStarted","Data":"96cc773691b03318dbaf227c1fd5a5ae4ac13e16f9da8f06fcf028acb1df8f59"} Feb 01 08:22:17 crc kubenswrapper[5127]: E0201 08:22:17.143649 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2b808a_c9e0_4aa7_abc9_5651323b18f5.slice/crio-conmon-a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2b808a_c9e0_4aa7_abc9_5651323b18f5.slice/crio-a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c.scope\": RecentStats: unable to find data in memory cache]" Feb 01 08:22:17 crc kubenswrapper[5127]: I0201 08:22:17.245545 5127 generic.go:334] "Generic (PLEG): container finished" podID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerID="a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c" exitCode=0 Feb 01 08:22:17 crc kubenswrapper[5127]: I0201 08:22:17.245598 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerDied","Data":"a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c"} Feb 01 08:22:19 crc kubenswrapper[5127]: I0201 08:22:19.275018 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerStarted","Data":"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"} Feb 01 08:22:19 crc kubenswrapper[5127]: I0201 08:22:19.310715 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7mj9s" podStartSLOduration=3.53610076 podStartE2EDuration="6.31069339s" podCreationTimestamp="2026-02-01 08:22:13 +0000 UTC" firstStartedPulling="2026-02-01 08:22:15.224036762 +0000 UTC m=+5685.709939125" lastFinishedPulling="2026-02-01 08:22:17.998629352 +0000 UTC m=+5688.484531755" 
observedRunningTime="2026-02-01 08:22:19.307417452 +0000 UTC m=+5689.793319865" watchObservedRunningTime="2026-02-01 08:22:19.31069339 +0000 UTC m=+5689.796595793"
Feb 01 08:22:24 crc kubenswrapper[5127]: I0201 08:22:24.147703 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:24 crc kubenswrapper[5127]: I0201 08:22:24.148363 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:25 crc kubenswrapper[5127]: I0201 08:22:25.192401 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7mj9s" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="registry-server" probeResult="failure" output=<
Feb 01 08:22:25 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 08:22:25 crc kubenswrapper[5127]: >
Feb 01 08:22:34 crc kubenswrapper[5127]: I0201 08:22:34.224008 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:34 crc kubenswrapper[5127]: I0201 08:22:34.291873 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:34 crc kubenswrapper[5127]: I0201 08:22:34.483683 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"]
Feb 01 08:22:35 crc kubenswrapper[5127]: I0201 08:22:35.402333 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7mj9s" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="registry-server" containerID="cri-o://8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e" gracePeriod=2
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.355205 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj9s"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.412981 5127 generic.go:334] "Generic (PLEG): container finished" podID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerID="8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e" exitCode=0
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.413060 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerDied","Data":"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"}
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.413103 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7mj9s"
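Note: the startup probe above fails at 08:22:25 because the catalog's gRPC endpoint on :50051 is not yet accepting connections, then flips to started/ready at 08:22:34 once the registry server is up; the pod is then killed with gracePeriod=2 only because an API DELETE arrived, not because of the probe. The output format suggests a grpc-health-probe-style exec check (an assumption; the log does not name the probe binary). A rough stdlib-only stand-in for its connect-within-1s step, with the address and timeout taken from the log and everything else assumed:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Mimic the failing check: can we reach the registry-server port in 1s?
        conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
        if err != nil {
            // Comparable to: timeout: failed to connect service ":50051" within 1s
            fmt.Printf("timeout: failed to connect service %q within 1s (%v)\n", ":50051", err)
            return
        }
        defer conn.Close()
        fmt.Println("probe ok: port accepting connections")
    }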
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.413134 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7mj9s" event={"ID":"4a2b808a-c9e0-4aa7-abc9-5651323b18f5","Type":"ContainerDied","Data":"96cc773691b03318dbaf227c1fd5a5ae4ac13e16f9da8f06fcf028acb1df8f59"}
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.413176 5127 scope.go:117] "RemoveContainer" containerID="8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.439466 5127 scope.go:117] "RemoveContainer" containerID="a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.465441 5127 scope.go:117] "RemoveContainer" containerID="641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.499135 5127 scope.go:117] "RemoveContainer" containerID="8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"
Feb 01 08:22:36 crc kubenswrapper[5127]: E0201 08:22:36.499754 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e\": container with ID starting with 8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e not found: ID does not exist" containerID="8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.499814 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e"} err="failed to get container status \"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e\": rpc error: code = NotFound desc = could not find container \"8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e\": container with ID starting with 8e25779902bc10d4563aca821a1697c9bd0176a09ae46962429716e0e46a335e not found: ID does not exist"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.499849 5127 scope.go:117] "RemoveContainer" containerID="a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c"
Feb 01 08:22:36 crc kubenswrapper[5127]: E0201 08:22:36.500308 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c\": container with ID starting with a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c not found: ID does not exist" containerID="a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.500360 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c"} err="failed to get container status \"a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c\": rpc error: code = NotFound desc = could not find container \"a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c\": container with ID starting with a4a5ae3a3fbe2659cfc8d2e5281cf724dd155562567e98b50b9e639b4dbff78c not found: ID does not exist"
Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.500396 5127 scope.go:117] "RemoveContainer" containerID="641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c"
containerID="641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c" Feb 01 08:22:36 crc kubenswrapper[5127]: E0201 08:22:36.500873 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c\": container with ID starting with 641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c not found: ID does not exist" containerID="641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.500917 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c"} err="failed to get container status \"641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c\": rpc error: code = NotFound desc = could not find container \"641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c\": container with ID starting with 641f227dc54ddde52a375bb8e85db3d63e409a1a64bd798af3726abc72dd9a7c not found: ID does not exist" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.546215 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gtv5\" (UniqueName: \"kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5\") pod \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.546358 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content\") pod \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.546606 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities\") pod \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\" (UID: \"4a2b808a-c9e0-4aa7-abc9-5651323b18f5\") " Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.547970 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities" (OuterVolumeSpecName: "utilities") pod "4a2b808a-c9e0-4aa7-abc9-5651323b18f5" (UID: "4a2b808a-c9e0-4aa7-abc9-5651323b18f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.555679 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5" (OuterVolumeSpecName: "kube-api-access-9gtv5") pod "4a2b808a-c9e0-4aa7-abc9-5651323b18f5" (UID: "4a2b808a-c9e0-4aa7-abc9-5651323b18f5"). InnerVolumeSpecName "kube-api-access-9gtv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.649022 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gtv5\" (UniqueName: \"kubernetes.io/projected/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-kube-api-access-9gtv5\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.649058 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.673818 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a2b808a-c9e0-4aa7-abc9-5651323b18f5" (UID: "4a2b808a-c9e0-4aa7-abc9-5651323b18f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.750665 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a2b808a-c9e0-4aa7-abc9-5651323b18f5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.752277 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"] Feb 01 08:22:36 crc kubenswrapper[5127]: I0201 08:22:36.759787 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7mj9s"] Feb 01 08:22:38 crc kubenswrapper[5127]: I0201 08:22:38.249488 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" path="/var/lib/kubelet/pods/4a2b808a-c9e0-4aa7-abc9-5651323b18f5/volumes" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.071070 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-4n6s2"] Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.079924 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-4n6s2"] Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.169893 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-kxwjm"] Feb 01 08:22:46 crc kubenswrapper[5127]: E0201 08:22:46.170202 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="registry-server" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.170219 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="registry-server" Feb 01 08:22:46 crc kubenswrapper[5127]: E0201 08:22:46.170240 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="extract-content" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.170248 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="extract-content" Feb 01 08:22:46 crc kubenswrapper[5127]: E0201 08:22:46.170267 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="extract-utilities" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.170273 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" 
containerName="extract-utilities" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.170443 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2b808a-c9e0-4aa7-abc9-5651323b18f5" containerName="registry-server" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.171014 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.175057 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.177065 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.177281 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.180061 5127 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wxj58" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.194055 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-kxwjm"] Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.200003 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.200069 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxfmp\" (UniqueName: \"kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.200130 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.242759 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377e5b6c-e0fb-4e24-8b4d-19f66394ee94" path="/var/lib/kubelet/pods/377e5b6c-e0fb-4e24-8b4d-19f66394ee94/volumes" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.301144 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.302048 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.302069 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-mxfmp\" (UniqueName: \"kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.302353 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.302593 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.331004 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxfmp\" (UniqueName: \"kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp\") pod \"crc-storage-crc-kxwjm\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") " pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.498676 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:46 crc kubenswrapper[5127]: I0201 08:22:46.825239 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-kxwjm"] Feb 01 08:22:47 crc kubenswrapper[5127]: I0201 08:22:47.518270 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-kxwjm" event={"ID":"61f3dbe6-d5b0-475d-8aea-d8190c267d12","Type":"ContainerStarted","Data":"a4f6dd18fed9b260bde2d3f024a9919f17a73a359c70b5a50e01f5245bab08ca"} Feb 01 08:22:47 crc kubenswrapper[5127]: I0201 08:22:47.518800 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-kxwjm" event={"ID":"61f3dbe6-d5b0-475d-8aea-d8190c267d12","Type":"ContainerStarted","Data":"39c3b6466b2f192082333fa4bc557aeefdc1be7cc3e79485cf17b4044a8b0426"} Feb 01 08:22:47 crc kubenswrapper[5127]: I0201 08:22:47.544614 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-kxwjm" podStartSLOduration=1.1351394830000001 podStartE2EDuration="1.544574296s" podCreationTimestamp="2026-02-01 08:22:46 +0000 UTC" firstStartedPulling="2026-02-01 08:22:46.836899993 +0000 UTC m=+5717.322802366" lastFinishedPulling="2026-02-01 08:22:47.246334786 +0000 UTC m=+5717.732237179" observedRunningTime="2026-02-01 08:22:47.537691372 +0000 UTC m=+5718.023593765" watchObservedRunningTime="2026-02-01 08:22:47.544574296 +0000 UTC m=+5718.030476679" Feb 01 08:22:48 crc kubenswrapper[5127]: I0201 08:22:48.533696 5127 generic.go:334] "Generic (PLEG): container finished" podID="61f3dbe6-d5b0-475d-8aea-d8190c267d12" containerID="a4f6dd18fed9b260bde2d3f024a9919f17a73a359c70b5a50e01f5245bab08ca" exitCode=0 Feb 01 08:22:48 crc kubenswrapper[5127]: I0201 08:22:48.534191 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-kxwjm" event={"ID":"61f3dbe6-d5b0-475d-8aea-d8190c267d12","Type":"ContainerDied","Data":"a4f6dd18fed9b260bde2d3f024a9919f17a73a359c70b5a50e01f5245bab08ca"} Feb 01 08:22:49 crc kubenswrapper[5127]: I0201 
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.107747 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxfmp\" (UniqueName: \"kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp\") pod \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") "
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.107924 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage\") pod \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") "
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.108160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt\") pod \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\" (UID: \"61f3dbe6-d5b0-475d-8aea-d8190c267d12\") "
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.108260 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "61f3dbe6-d5b0-475d-8aea-d8190c267d12" (UID: "61f3dbe6-d5b0-475d-8aea-d8190c267d12"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.108724 5127 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/61f3dbe6-d5b0-475d-8aea-d8190c267d12-node-mnt\") on node \"crc\" DevicePath \"\""
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.117133 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp" (OuterVolumeSpecName: "kube-api-access-mxfmp") pod "61f3dbe6-d5b0-475d-8aea-d8190c267d12" (UID: "61f3dbe6-d5b0-475d-8aea-d8190c267d12"). InnerVolumeSpecName "kube-api-access-mxfmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.138903 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "61f3dbe6-d5b0-475d-8aea-d8190c267d12" (UID: "61f3dbe6-d5b0-475d-8aea-d8190c267d12"). InnerVolumeSpecName "crc-storage".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.210563 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxfmp\" (UniqueName: \"kubernetes.io/projected/61f3dbe6-d5b0-475d-8aea-d8190c267d12-kube-api-access-mxfmp\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.210640 5127 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/61f3dbe6-d5b0-475d-8aea-d8190c267d12-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.562185 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-kxwjm" event={"ID":"61f3dbe6-d5b0-475d-8aea-d8190c267d12","Type":"ContainerDied","Data":"39c3b6466b2f192082333fa4bc557aeefdc1be7cc3e79485cf17b4044a8b0426"} Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.562261 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39c3b6466b2f192082333fa4bc557aeefdc1be7cc3e79485cf17b4044a8b0426" Feb 01 08:22:50 crc kubenswrapper[5127]: I0201 08:22:50.562373 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-kxwjm" Feb 01 08:22:51 crc kubenswrapper[5127]: I0201 08:22:51.898926 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-kxwjm"] Feb 01 08:22:51 crc kubenswrapper[5127]: I0201 08:22:51.910756 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-kxwjm"] Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.074175 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-psrnc"] Feb 01 08:22:52 crc kubenswrapper[5127]: E0201 08:22:52.074669 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f3dbe6-d5b0-475d-8aea-d8190c267d12" containerName="storage" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.074695 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f3dbe6-d5b0-475d-8aea-d8190c267d12" containerName="storage" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.074901 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f3dbe6-d5b0-475d-8aea-d8190c267d12" containerName="storage" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.075481 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.077942 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.078457 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.079228 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.083949 5127 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-wxj58" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.084568 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psrnc"] Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.150792 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.150924 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.151012 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fzg6\" (UniqueName: \"kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.249526 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f3dbe6-d5b0-475d-8aea-d8190c267d12" path="/var/lib/kubelet/pods/61f3dbe6-d5b0-475d-8aea-d8190c267d12/volumes" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.252439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fzg6\" (UniqueName: \"kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.252612 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.252670 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.252984 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.253478 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.280675 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fzg6\" (UniqueName: \"kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6\") pod \"crc-storage-crc-psrnc\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:52 crc kubenswrapper[5127]: I0201 08:22:52.433786 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:53 crc kubenswrapper[5127]: I0201 08:22:53.320408 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psrnc"] Feb 01 08:22:53 crc kubenswrapper[5127]: I0201 08:22:53.593651 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psrnc" event={"ID":"6334986f-3125-4611-bf58-cde7978e186d","Type":"ContainerStarted","Data":"31873b2fdf247d6a9623dc498d555130bc8576eed55be2fc8337fbdddd14f437"} Feb 01 08:22:54 crc kubenswrapper[5127]: I0201 08:22:54.608305 5127 generic.go:334] "Generic (PLEG): container finished" podID="6334986f-3125-4611-bf58-cde7978e186d" containerID="36cdf42fb987c45e817f37d5975d0900df2906feeba96d872b407b2a34bf7338" exitCode=0 Feb 01 08:22:54 crc kubenswrapper[5127]: I0201 08:22:54.608449 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psrnc" event={"ID":"6334986f-3125-4611-bf58-cde7978e186d","Type":"ContainerDied","Data":"36cdf42fb987c45e817f37d5975d0900df2906feeba96d872b407b2a34bf7338"} Feb 01 08:22:55 crc kubenswrapper[5127]: I0201 08:22:55.970541 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.037250 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fzg6\" (UniqueName: \"kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6\") pod \"6334986f-3125-4611-bf58-cde7978e186d\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.037468 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage\") pod \"6334986f-3125-4611-bf58-cde7978e186d\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.037543 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt\") pod \"6334986f-3125-4611-bf58-cde7978e186d\" (UID: \"6334986f-3125-4611-bf58-cde7978e186d\") " Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.037881 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6334986f-3125-4611-bf58-cde7978e186d" (UID: "6334986f-3125-4611-bf58-cde7978e186d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.038372 5127 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6334986f-3125-4611-bf58-cde7978e186d-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.043968 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6" (OuterVolumeSpecName: "kube-api-access-8fzg6") pod "6334986f-3125-4611-bf58-cde7978e186d" (UID: "6334986f-3125-4611-bf58-cde7978e186d"). InnerVolumeSpecName "kube-api-access-8fzg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.070734 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6334986f-3125-4611-bf58-cde7978e186d" (UID: "6334986f-3125-4611-bf58-cde7978e186d"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.139972 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fzg6\" (UniqueName: \"kubernetes.io/projected/6334986f-3125-4611-bf58-cde7978e186d-kube-api-access-8fzg6\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.140239 5127 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6334986f-3125-4611-bf58-cde7978e186d-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.635191 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psrnc" event={"ID":"6334986f-3125-4611-bf58-cde7978e186d","Type":"ContainerDied","Data":"31873b2fdf247d6a9623dc498d555130bc8576eed55be2fc8337fbdddd14f437"} Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.635555 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31873b2fdf247d6a9623dc498d555130bc8576eed55be2fc8337fbdddd14f437" Feb 01 08:22:56 crc kubenswrapper[5127]: I0201 08:22:56.635305 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-psrnc" Feb 01 08:23:06 crc kubenswrapper[5127]: I0201 08:23:06.740636 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:23:06 crc kubenswrapper[5127]: I0201 08:23:06.741458 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:23:36 crc kubenswrapper[5127]: I0201 08:23:36.741356 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:23:36 crc kubenswrapper[5127]: I0201 08:23:36.741964 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:23:41 crc kubenswrapper[5127]: I0201 08:23:41.785505 5127 scope.go:117] "RemoveContainer" containerID="3f0de04c7ee6cbadfdd33a4d628cf344e5cc5c64b0f94574b5ece8c9a0e85473" Feb 01 08:24:06 crc kubenswrapper[5127]: I0201 08:24:06.741649 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:24:06 crc kubenswrapper[5127]: I0201 08:24:06.742330 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:24:06 crc kubenswrapper[5127]: I0201 08:24:06.742404 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:24:06 crc kubenswrapper[5127]: I0201 08:24:06.743334 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:24:06 crc kubenswrapper[5127]: I0201 08:24:06.743439 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682" gracePeriod=600 Feb 01 08:24:07 crc kubenswrapper[5127]: I0201 08:24:07.268913 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682" exitCode=0 Feb 01 08:24:07 crc kubenswrapper[5127]: I0201 08:24:07.269001 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682"} Feb 01 08:24:07 crc kubenswrapper[5127]: I0201 08:24:07.269180 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"} Feb 01 08:24:07 crc kubenswrapper[5127]: I0201 08:24:07.269207 5127 scope.go:117] "RemoveContainer" containerID="36896d75fac77f226864134e9e4f249025ccd78c7e22c161801c49d1bd5c2974" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.359846 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"] Feb 01 08:25:04 crc kubenswrapper[5127]: E0201 08:25:04.360706 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6334986f-3125-4611-bf58-cde7978e186d" containerName="storage" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.360723 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6334986f-3125-4611-bf58-cde7978e186d" containerName="storage" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.360918 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6334986f-3125-4611-bf58-cde7978e186d" containerName="storage" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.361867 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.364065 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.364266 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-krcf7"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.364571 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.365475 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.365554 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.396031 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"]
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.423969 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7z46\" (UniqueName: \"kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.424018 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.424134 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.525728 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.525788 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7z46\" (UniqueName: \"kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.525805 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8"
Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.526643 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\"
(UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.526929 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.538728 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.539853 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.560307 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.568744 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7z46\" (UniqueName: \"kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46\") pod \"dnsmasq-dns-65fd7d585f-kxdz8\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.627043 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hk2r\" (UniqueName: \"kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.627118 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.627301 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.682273 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.728218 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hk2r\" (UniqueName: \"kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.728505 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.728546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.730728 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.730832 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.746388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hk2r\" (UniqueName: \"kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r\") pod \"dnsmasq-dns-d5cc45bbc-s556r\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:04 crc kubenswrapper[5127]: I0201 08:25:04.854453 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.134619 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.144034 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.267968 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.435764 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.437062 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.438885 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.439125 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.439611 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.439638 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.439674 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vgk6b" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.462179 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541367 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541434 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541470 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541500 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541708 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541763 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqsk\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541821 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541918 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.541967 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643071 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643214 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643258 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643285 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643337 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643356 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqsk\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643381 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643412 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643434 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.643663 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.644526 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.645054 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.645691 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.647968 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.648007 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88e60ebfd736eab7c95666109ce6af3bc7b62ad4c675fb78d10cca87ffe539bd/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.649249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.649906 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.650493 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.663885 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqsk\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.673115 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.704241 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.707255 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.709278 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.709795 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.710113 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.710335 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9ztnb" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.710998 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.713239 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.778809 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847514 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847563 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847600 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847620 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847645 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847715 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847734 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847777 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nw9\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.847824 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.857838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" event={"ID":"ca9690af-3744-442b-bae6-7662e965149e","Type":"ContainerStarted","Data":"727507f09341033f1945a4b383c6829b2cc26592421b625139fd58030d2c2ea2"} Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.860860 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" event={"ID":"98e9f297-bf26-40be-8ad6-281852730569","Type":"ContainerStarted","Data":"87dd9c4d48242748a2e89621bb50fb4c2c47d10949335dbaf466a4d2bcfee74f"} Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.957871 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.956431 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960286 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960330 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 
08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960361 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960515 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960681 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nw9\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.960770 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.968022 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.968742 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.969359 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.969658 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.971489 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:05 crc kubenswrapper[5127]: I0201 08:25:05.972610 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.002562 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nw9\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.003879 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.003914 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cecaf4f1d642cec9adaf1e51c239b9030e1fd6747f1ef12f89f749c4cf42182/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.224395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.347548 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.407346 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:25:06 crc kubenswrapper[5127]: W0201 08:25:06.438282 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35805b39_1109_45f6_a3eb_41804335ad1a.slice/crio-0ee045a4942b41df2263008ec9e5726e190cde1b99058c6ca1cbb1948c5a210b WatchSource:0}: Error finding container 0ee045a4942b41df2263008ec9e5726e190cde1b99058c6ca1cbb1948c5a210b: Status 404 returned error can't find the container with id 0ee045a4942b41df2263008ec9e5726e190cde1b99058c6ca1cbb1948c5a210b Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.871826 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:25:06 crc kubenswrapper[5127]: I0201 08:25:06.873014 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerStarted","Data":"0ee045a4942b41df2263008ec9e5726e190cde1b99058c6ca1cbb1948c5a210b"} Feb 01 08:25:06 crc kubenswrapper[5127]: W0201 08:25:06.885238 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb54d8b4c_18e6_4d64_ae91_4434f3eb3552.slice/crio-c7a3c50645c7f49447216d33ea9eceba29fe56a0647d43aebb99bdad839a9e0c WatchSource:0}: Error finding container c7a3c50645c7f49447216d33ea9eceba29fe56a0647d43aebb99bdad839a9e0c: Status 404 returned error can't find the container with id c7a3c50645c7f49447216d33ea9eceba29fe56a0647d43aebb99bdad839a9e0c Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.050172 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.061519 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.068860 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bmvnm" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.069087 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.070396 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.071059 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.079850 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.082444 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180184 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-kolla-config\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180275 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-default\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180309 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180362 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prt7h\" (UniqueName: \"kubernetes.io/projected/82cefe36-3a26-4a7e-add3-b445cb590fe5-kube-api-access-prt7h\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180408 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180517 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b09dcf9-45af-4182-b387-1036a157dc69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b09dcf9-45af-4182-b387-1036a157dc69\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180548 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.180565 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283825 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283877 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283924 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-kolla-config\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283946 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-default\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283970 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.283994 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prt7h\" (UniqueName: \"kubernetes.io/projected/82cefe36-3a26-4a7e-add3-b445cb590fe5-kube-api-access-prt7h\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.284050 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.284109 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b09dcf9-45af-4182-b387-1036a157dc69\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b09dcf9-45af-4182-b387-1036a157dc69\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.285232 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.287197 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-kolla-config\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.287640 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-config-data-default\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.289333 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82cefe36-3a26-4a7e-add3-b445cb590fe5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.290971 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.315253 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/82cefe36-3a26-4a7e-add3-b445cb590fe5-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.322069 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prt7h\" (UniqueName: \"kubernetes.io/projected/82cefe36-3a26-4a7e-add3-b445cb590fe5-kube-api-access-prt7h\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.324117 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.324167 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b09dcf9-45af-4182-b387-1036a157dc69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b09dcf9-45af-4182-b387-1036a157dc69\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/266438222af95a9bce5a9f4ebead5b14526677f25a40ba0ebb6bc7706a5f364a/globalmount\"" pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.357905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b09dcf9-45af-4182-b387-1036a157dc69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4b09dcf9-45af-4182-b387-1036a157dc69\") pod \"openstack-galera-0\" (UID: \"82cefe36-3a26-4a7e-add3-b445cb590fe5\") " pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.398553 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.462552 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.465397 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.470700 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.470717 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-tznbd" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.477429 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.514437 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-config-data\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.514502 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmq6d\" (UniqueName: \"kubernetes.io/projected/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kube-api-access-qmq6d\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.514553 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kolla-config\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.617300 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kolla-config\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.617377 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-config-data\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.617476 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmq6d\" (UniqueName: \"kubernetes.io/projected/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kube-api-access-qmq6d\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.618046 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kolla-config\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.618559 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-config-data\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.638496 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmq6d\" (UniqueName: \"kubernetes.io/projected/2bc6b5e9-8c7b-4144-b943-e57514d1f11f-kube-api-access-qmq6d\") pod \"memcached-0\" (UID: \"2bc6b5e9-8c7b-4144-b943-e57514d1f11f\") " pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.851928 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.896805 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerStarted","Data":"c7a3c50645c7f49447216d33ea9eceba29fe56a0647d43aebb99bdad839a9e0c"} Feb 01 08:25:07 crc kubenswrapper[5127]: I0201 08:25:07.896967 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 01 08:25:07 crc kubenswrapper[5127]: W0201 08:25:07.927683 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82cefe36_3a26_4a7e_add3_b445cb590fe5.slice/crio-226f3a9049945719a13d74841e0904345cb827f9d260238c0ffae4bc33beecf9 WatchSource:0}: Error finding container 226f3a9049945719a13d74841e0904345cb827f9d260238c0ffae4bc33beecf9: Status 404 returned error can't find the container with id 226f3a9049945719a13d74841e0904345cb827f9d260238c0ffae4bc33beecf9 Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.361284 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 01 08:25:08 crc kubenswrapper[5127]: W0201 08:25:08.366234 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bc6b5e9_8c7b_4144_b943_e57514d1f11f.slice/crio-f34d17f7a861c88a3f04a8f820ae22ed230b34254da5c7df7c63c985f2364ffc WatchSource:0}: Error finding container f34d17f7a861c88a3f04a8f820ae22ed230b34254da5c7df7c63c985f2364ffc: Status 404 returned error can't find the container with id f34d17f7a861c88a3f04a8f820ae22ed230b34254da5c7df7c63c985f2364ffc Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.645512 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.647524 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.655373 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.661179 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.661460 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.662517 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-jjrqr" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.671275 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743258 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743337 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743364 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743411 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcrl\" (UniqueName: \"kubernetes.io/projected/d42fff15-1ed7-468f-bf75-609929079667-kube-api-access-8lcrl\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743435 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743461 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d42fff15-1ed7-468f-bf75-609929079667-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743480 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.743497 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.844871 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcrl\" (UniqueName: \"kubernetes.io/projected/d42fff15-1ed7-468f-bf75-609929079667-kube-api-access-8lcrl\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.844940 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.844989 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d42fff15-1ed7-468f-bf75-609929079667-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.845021 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.845043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.845104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.845157 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.845189 5127 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.846028 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d42fff15-1ed7-468f-bf75-609929079667-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.847006 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.847109 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.847648 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d42fff15-1ed7-468f-bf75-609929079667-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.848622 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.848651 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/075507d696ef2360e266c6820157cc1019414db7c77e4c53bcbe0020187a1b16/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.850096 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.850810 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d42fff15-1ed7-468f-bf75-609929079667-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.868374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcrl\" (UniqueName: \"kubernetes.io/projected/d42fff15-1ed7-468f-bf75-609929079667-kube-api-access-8lcrl\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.890392 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-534d56af-0053-4cd3-ab24-ea0163ef2ae6\") pod \"openstack-cell1-galera-0\" (UID: \"d42fff15-1ed7-468f-bf75-609929079667\") " pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.921399 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2bc6b5e9-8c7b-4144-b943-e57514d1f11f","Type":"ContainerStarted","Data":"f34d17f7a861c88a3f04a8f820ae22ed230b34254da5c7df7c63c985f2364ffc"}
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.929168 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"82cefe36-3a26-4a7e-add3-b445cb590fe5","Type":"ContainerStarted","Data":"226f3a9049945719a13d74841e0904345cb827f9d260238c0ffae4bc33beecf9"}
Feb 01 08:25:08 crc kubenswrapper[5127]: I0201 08:25:08.977427 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Feb 01 08:25:09 crc kubenswrapper[5127]: W0201 08:25:09.401799 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd42fff15_1ed7_468f_bf75_609929079667.slice/crio-9d043294c1326cce0f5795558e7cf96836f1f7371fc578de35306d0f0c65f1e0 WatchSource:0}: Error finding container 9d043294c1326cce0f5795558e7cf96836f1f7371fc578de35306d0f0c65f1e0: Status 404 returned error can't find the container with id 9d043294c1326cce0f5795558e7cf96836f1f7371fc578de35306d0f0c65f1e0
Feb 01 08:25:09 crc kubenswrapper[5127]: I0201 08:25:09.402098 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Feb 01 08:25:09 crc kubenswrapper[5127]: I0201 08:25:09.941703 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d42fff15-1ed7-468f-bf75-609929079667","Type":"ContainerStarted","Data":"9d043294c1326cce0f5795558e7cf96836f1f7371fc578de35306d0f0c65f1e0"}
Feb 01 08:25:27 crc kubenswrapper[5127]: E0201 08:25:27.252803 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:27 crc kubenswrapper[5127]: E0201 08:25:27.253473 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:27 crc kubenswrapper[5127]: E0201 08:25:27.253694 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8a0e02dd0fb8f726038072d0e3af1871,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prt7h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(82cefe36-3a26-4a7e-add3-b445cb590fe5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:27 crc kubenswrapper[5127]: E0201 08:25:27.255016 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="82cefe36-3a26-4a7e-add3-b445cb590fe5"
Feb 01 08:25:28 crc kubenswrapper[5127]: E0201 08:25:28.102196 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/openstack-galera-0" podUID="82cefe36-3a26-4a7e-add3-b445cb590fe5"
Feb 01 08:25:28 crc kubenswrapper[5127]: E0201 08:25:28.482949 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:28 crc kubenswrapper[5127]: E0201 08:25:28.483012 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:28 crc kubenswrapper[5127]: E0201 08:25:28.483201 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4nqsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(35805b39-1109-45f6-a3eb-41804335ad1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:28 crc kubenswrapper[5127]: E0201 08:25:28.484437 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="35805b39-1109-45f6-a3eb-41804335ad1a"
Feb 01 08:25:29 crc kubenswrapper[5127]: E0201 08:25:29.110540 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/rabbitmq-server-0" podUID="35805b39-1109-45f6-a3eb-41804335ad1a"
Feb 01 08:25:29 crc kubenswrapper[5127]: E0201 08:25:29.315328 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:29 crc kubenswrapper[5127]: E0201 08:25:29.315396 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:29 crc kubenswrapper[5127]: E0201 08:25:29.315541 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8a0e02dd0fb8f726038072d0e3af1871,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nd9h78h5cbh78h5cbh649hch596h564h65dh57fh546hcbh568h695h5f8h544h54h5f5h5d4h665h59dhffh688h64h89h679hddh5cdh5c7hc7h544q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmq6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(2bc6b5e9-8c7b-4144-b943-e57514d1f11f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:29 crc kubenswrapper[5127]: E0201 08:25:29.321680 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="2bc6b5e9-8c7b-4144-b943-e57514d1f11f"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.119152 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/memcached-0" podUID="2bc6b5e9-8c7b-4144-b943-e57514d1f11f"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.215399 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.215448 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.215561 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g7nw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(b54d8b4c-18e6-4d64-ae91-4434f3eb3552): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.216782 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.242792 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.242844 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.242975 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d7z46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-65fd7d585f-kxdz8_openstack(ca9690af-3744-442b-bae6-7662e965149e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.244342 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" podUID="ca9690af-3744-442b-bae6-7662e965149e"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.254883 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.254999 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.255790 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h564h676h699hcdh67bh66hfdh569h545h648h94h546h696h668h89h96h667h575h595h5d9h584h8dhbdh697h54bhb7h58fh5c9hd8h5cdh5c7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8hk2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-d5cc45bbc-s556r_openstack(98e9f297-bf26-40be-8ad6-281852730569): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:25:30 crc kubenswrapper[5127]: E0201 08:25:30.257051 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" podUID="98e9f297-bf26-40be-8ad6-281852730569"
Feb 01 08:25:31 crc kubenswrapper[5127]: I0201 08:25:31.130946 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d42fff15-1ed7-468f-bf75-609929079667","Type":"ContainerStarted","Data":"f74abb91e8d05634960a4dd45ee499e130c25d806e3c118f9bc3374a3a468692"}
Feb 01 08:25:31 crc kubenswrapper[5127]: E0201 08:25:31.132843 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-rabbitmq:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552"
Feb 01 08:25:31 crc kubenswrapper[5127]: E0201 08:25:31.132930 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" podUID="98e9f297-bf26-40be-8ad6-281852730569"
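The sequence above shows the standard escalation: the first failed sync reports ErrImagePull, and subsequent syncs report ImagePullBackOff while kubelet waits out a growing delay before calling PullImage again. A sketch of that backoff shape; the 10s initial delay and 5m cap are commonly cited kubelet defaults, so treat the exact constants as assumptions rather than values read from this log:

```go
package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous delay and clamps it at a limit, which is
// the general shape of kubelet's image pull backoff.
func nextBackoff(prev, limit time.Duration) time.Duration {
	if prev == 0 {
		return 10 * time.Second // assumed initial delay
	}
	if next := prev * 2; next < limit {
		return next
	}
	return limit
}

func main() {
	var delay time.Duration
	for attempt := 1; attempt <= 7; attempt++ {
		delay = nextBackoff(delay, 5*time.Minute)
		fmt.Printf("attempt %d: report ImagePullBackOff, retry PullImage in %v\n", attempt, delay)
	}
}
```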
\"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" podUID="ca9690af-3744-442b-bae6-7662e965149e" Feb 01 08:25:35 crc kubenswrapper[5127]: I0201 08:25:35.158682 5127 generic.go:334] "Generic (PLEG): container finished" podID="d42fff15-1ed7-468f-bf75-609929079667" containerID="f74abb91e8d05634960a4dd45ee499e130c25d806e3c118f9bc3374a3a468692" exitCode=0 Feb 01 08:25:35 crc kubenswrapper[5127]: I0201 08:25:35.158826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d42fff15-1ed7-468f-bf75-609929079667","Type":"ContainerDied","Data":"f74abb91e8d05634960a4dd45ee499e130c25d806e3c118f9bc3374a3a468692"} Feb 01 08:25:36 crc kubenswrapper[5127]: I0201 08:25:36.171154 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d42fff15-1ed7-468f-bf75-609929079667","Type":"ContainerStarted","Data":"f3e752a091ba91724cec94fd4c97f50a9f72b28dd2ed9415c2253c1a2cbe285e"} Feb 01 08:25:36 crc kubenswrapper[5127]: I0201 08:25:36.212584 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.380063256 podStartE2EDuration="29.212546384s" podCreationTimestamp="2026-02-01 08:25:07 +0000 UTC" firstStartedPulling="2026-02-01 08:25:09.40574413 +0000 UTC m=+5859.891646493" lastFinishedPulling="2026-02-01 08:25:30.238227258 +0000 UTC m=+5880.724129621" observedRunningTime="2026-02-01 08:25:36.204027256 +0000 UTC m=+5886.689929689" watchObservedRunningTime="2026-02-01 08:25:36.212546384 +0000 UTC m=+5886.698448787" Feb 01 08:25:38 crc kubenswrapper[5127]: I0201 08:25:38.977577 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:38 crc kubenswrapper[5127]: I0201 08:25:38.978081 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:42 crc kubenswrapper[5127]: I0201 08:25:42.219913 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"82cefe36-3a26-4a7e-add3-b445cb590fe5","Type":"ContainerStarted","Data":"d7286559287bdf4874c458c0cdd2e558c81e6a9eecc08fe041dd4885f66e482e"} Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.232335 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerStarted","Data":"53067efc201c5cf14c26802f3eff339c16637b0f4ddba4a1ae77149c5b65d251"} Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.234035 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"2bc6b5e9-8c7b-4144-b943-e57514d1f11f","Type":"ContainerStarted","Data":"0fe2a3db9a72448eb9234c46e4143b9a310953a469df9faf791efd2bf4203ab1"} Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.234847 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.235675 5127 generic.go:334] "Generic (PLEG): container finished" podID="98e9f297-bf26-40be-8ad6-281852730569" containerID="04c22418fcdc452fdd928cd2113c1c1d57e83520c4db833b7a172d004af3f2f5" exitCode=0 Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.235833 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" event={"ID":"98e9f297-bf26-40be-8ad6-281852730569","Type":"ContainerDied","Data":"04c22418fcdc452fdd928cd2113c1c1d57e83520c4db833b7a172d004af3f2f5"} Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.260662 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.1352115720000002 podStartE2EDuration="36.260641125s" podCreationTimestamp="2026-02-01 08:25:07 +0000 UTC" firstStartedPulling="2026-02-01 08:25:08.369201794 +0000 UTC m=+5858.855104157" lastFinishedPulling="2026-02-01 08:25:42.494631347 +0000 UTC m=+5892.980533710" observedRunningTime="2026-02-01 08:25:43.25634305 +0000 UTC m=+5893.742245433" watchObservedRunningTime="2026-02-01 08:25:43.260641125 +0000 UTC m=+5893.746543498" Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.641637 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:43 crc kubenswrapper[5127]: I0201 08:25:43.766491 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 01 08:25:44 crc kubenswrapper[5127]: I0201 08:25:44.258703 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerStarted","Data":"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2"} Feb 01 08:25:44 crc kubenswrapper[5127]: I0201 08:25:44.261954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" event={"ID":"98e9f297-bf26-40be-8ad6-281852730569","Type":"ContainerStarted","Data":"6bb87abcdea6b045b58aa2a6d2b524b65577a9c446772269e91c56253472d38e"} Feb 01 08:25:44 crc kubenswrapper[5127]: I0201 08:25:44.311657 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" podStartSLOduration=3.090302456 podStartE2EDuration="40.311632589s" podCreationTimestamp="2026-02-01 08:25:04 +0000 UTC" firstStartedPulling="2026-02-01 08:25:05.270656263 +0000 UTC m=+5855.756558626" lastFinishedPulling="2026-02-01 08:25:42.491986396 +0000 UTC m=+5892.977888759" observedRunningTime="2026-02-01 08:25:44.304657672 +0000 UTC m=+5894.790560045" watchObservedRunningTime="2026-02-01 08:25:44.311632589 +0000 UTC m=+5894.797534962" Feb 01 08:25:44 crc kubenswrapper[5127]: I0201 08:25:44.855760 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:45 crc kubenswrapper[5127]: I0201 08:25:45.276345 5127 generic.go:334] "Generic (PLEG): container finished" podID="82cefe36-3a26-4a7e-add3-b445cb590fe5" containerID="d7286559287bdf4874c458c0cdd2e558c81e6a9eecc08fe041dd4885f66e482e" exitCode=0 Feb 01 08:25:45 crc kubenswrapper[5127]: I0201 08:25:45.279159 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"82cefe36-3a26-4a7e-add3-b445cb590fe5","Type":"ContainerDied","Data":"d7286559287bdf4874c458c0cdd2e558c81e6a9eecc08fe041dd4885f66e482e"} Feb 01 08:25:46 crc kubenswrapper[5127]: I0201 08:25:46.288804 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"82cefe36-3a26-4a7e-add3-b445cb590fe5","Type":"ContainerStarted","Data":"4cf8bdebeee57e28464a69842a4339a3f53800fd08786ea1f609691ec9094ec6"} Feb 01 08:25:46 crc kubenswrapper[5127]: I0201 08:25:46.293342 5127 generic.go:334] "Generic (PLEG): 
container finished" podID="ca9690af-3744-442b-bae6-7662e965149e" containerID="6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9" exitCode=0 Feb 01 08:25:46 crc kubenswrapper[5127]: I0201 08:25:46.293382 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" event={"ID":"ca9690af-3744-442b-bae6-7662e965149e","Type":"ContainerDied","Data":"6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9"} Feb 01 08:25:46 crc kubenswrapper[5127]: I0201 08:25:46.336987 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371996.517809 podStartE2EDuration="40.33696656s" podCreationTimestamp="2026-02-01 08:25:06 +0000 UTC" firstStartedPulling="2026-02-01 08:25:07.931044911 +0000 UTC m=+5858.416947274" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:25:46.329367886 +0000 UTC m=+5896.815270249" watchObservedRunningTime="2026-02-01 08:25:46.33696656 +0000 UTC m=+5896.822868923" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.304362 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" event={"ID":"ca9690af-3744-442b-bae6-7662e965149e","Type":"ContainerStarted","Data":"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f"} Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.304633 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.327883 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" podStartSLOduration=-9223371993.526915 podStartE2EDuration="43.327861342s" podCreationTimestamp="2026-02-01 08:25:04 +0000 UTC" firstStartedPulling="2026-02-01 08:25:05.14376037 +0000 UTC m=+5855.629662733" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:25:47.324984415 +0000 UTC m=+5897.810886778" watchObservedRunningTime="2026-02-01 08:25:47.327861342 +0000 UTC m=+5897.813763705" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.399161 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.399913 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.653070 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c4xng"] Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.654525 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.657352 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.663457 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c4xng"] Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.713208 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.713328 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8v8r\" (UniqueName: \"kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.815511 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8v8r\" (UniqueName: \"kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.815681 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.816859 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.835462 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8v8r\" (UniqueName: \"kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r\") pod \"root-account-create-update-c4xng\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.853654 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 01 08:25:47 crc kubenswrapper[5127]: I0201 08:25:47.972149 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:48 crc kubenswrapper[5127]: I0201 08:25:48.436515 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c4xng"] Feb 01 08:25:48 crc kubenswrapper[5127]: W0201 08:25:48.442915 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod172ff7f9_553a_4125_89e9_4a1082f27994.slice/crio-a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da WatchSource:0}: Error finding container a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da: Status 404 returned error can't find the container with id a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.321374 5127 generic.go:334] "Generic (PLEG): container finished" podID="172ff7f9-553a-4125-89e9-4a1082f27994" containerID="40f8c3eaeef4f01d60f6e71ca05593645fde76f964d2077b56989684c7973158" exitCode=0 Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.321507 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c4xng" event={"ID":"172ff7f9-553a-4125-89e9-4a1082f27994","Type":"ContainerDied","Data":"40f8c3eaeef4f01d60f6e71ca05593645fde76f964d2077b56989684c7973158"} Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.321742 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c4xng" event={"ID":"172ff7f9-553a-4125-89e9-4a1082f27994","Type":"ContainerStarted","Data":"a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da"} Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.856750 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.917172 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"] Feb 01 08:25:49 crc kubenswrapper[5127]: I0201 08:25:49.917463 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="dnsmasq-dns" containerID="cri-o://1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f" gracePeriod=10 Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.312913 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.334381 5127 generic.go:334] "Generic (PLEG): container finished" podID="ca9690af-3744-442b-bae6-7662e965149e" containerID="1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f" exitCode=0 Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.334813 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.335400 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" event={"ID":"ca9690af-3744-442b-bae6-7662e965149e","Type":"ContainerDied","Data":"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f"} Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.335466 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd7d585f-kxdz8" event={"ID":"ca9690af-3744-442b-bae6-7662e965149e","Type":"ContainerDied","Data":"727507f09341033f1945a4b383c6829b2cc26592421b625139fd58030d2c2ea2"} Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.335503 5127 scope.go:117] "RemoveContainer" containerID="1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.351539 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7z46\" (UniqueName: \"kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46\") pod \"ca9690af-3744-442b-bae6-7662e965149e\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.351889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc\") pod \"ca9690af-3744-442b-bae6-7662e965149e\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.352050 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config\") pod \"ca9690af-3744-442b-bae6-7662e965149e\" (UID: \"ca9690af-3744-442b-bae6-7662e965149e\") " Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.365836 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46" (OuterVolumeSpecName: "kube-api-access-d7z46") pod "ca9690af-3744-442b-bae6-7662e965149e" (UID: "ca9690af-3744-442b-bae6-7662e965149e"). InnerVolumeSpecName "kube-api-access-d7z46". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.411893 5127 scope.go:117] "RemoveContainer" containerID="6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.417468 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca9690af-3744-442b-bae6-7662e965149e" (UID: "ca9690af-3744-442b-bae6-7662e965149e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.423030 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config" (OuterVolumeSpecName: "config") pod "ca9690af-3744-442b-bae6-7662e965149e" (UID: "ca9690af-3744-442b-bae6-7662e965149e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.436296 5127 scope.go:117] "RemoveContainer" containerID="1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f" Feb 01 08:25:50 crc kubenswrapper[5127]: E0201 08:25:50.436847 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f\": container with ID starting with 1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f not found: ID does not exist" containerID="1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.436918 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f"} err="failed to get container status \"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f\": rpc error: code = NotFound desc = could not find container \"1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f\": container with ID starting with 1c904242806910d98b3a412eb4566c615ce43a98f8d51b08a07d5aa318be9c5f not found: ID does not exist" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.436952 5127 scope.go:117] "RemoveContainer" containerID="6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9" Feb 01 08:25:50 crc kubenswrapper[5127]: E0201 08:25:50.440753 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9\": container with ID starting with 6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9 not found: ID does not exist" containerID="6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.440808 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9"} err="failed to get container status \"6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9\": rpc error: code = NotFound desc = could not find container \"6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9\": container with ID starting with 6a2fc98b4d5ea78c5209b3698eaf3f69b198745207992e30101a003dee39a1e9 not found: ID does not exist" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.454736 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.454781 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca9690af-3744-442b-bae6-7662e965149e-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.454798 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7z46\" (UniqueName: \"kubernetes.io/projected/ca9690af-3744-442b-bae6-7662e965149e-kube-api-access-d7z46\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.599824 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.658836 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8v8r\" (UniqueName: \"kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r\") pod \"172ff7f9-553a-4125-89e9-4a1082f27994\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.658918 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts\") pod \"172ff7f9-553a-4125-89e9-4a1082f27994\" (UID: \"172ff7f9-553a-4125-89e9-4a1082f27994\") " Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.659838 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "172ff7f9-553a-4125-89e9-4a1082f27994" (UID: "172ff7f9-553a-4125-89e9-4a1082f27994"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.662069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r" (OuterVolumeSpecName: "kube-api-access-h8v8r") pod "172ff7f9-553a-4125-89e9-4a1082f27994" (UID: "172ff7f9-553a-4125-89e9-4a1082f27994"). InnerVolumeSpecName "kube-api-access-h8v8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.670526 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"] Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.677636 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65fd7d585f-kxdz8"] Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.767573 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8v8r\" (UniqueName: \"kubernetes.io/projected/172ff7f9-553a-4125-89e9-4a1082f27994-kube-api-access-h8v8r\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:50 crc kubenswrapper[5127]: I0201 08:25:50.767663 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/172ff7f9-553a-4125-89e9-4a1082f27994-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:25:51 crc kubenswrapper[5127]: I0201 08:25:51.347733 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c4xng" event={"ID":"172ff7f9-553a-4125-89e9-4a1082f27994","Type":"ContainerDied","Data":"a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da"} Feb 01 08:25:51 crc kubenswrapper[5127]: I0201 08:25:51.347796 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75a10de82227cdbf2f996e22cd4fbc3d04b950dfcbb55cf06802982f2a0f5da" Feb 01 08:25:51 crc kubenswrapper[5127]: I0201 08:25:51.348688 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-c4xng" Feb 01 08:25:51 crc kubenswrapper[5127]: E0201 08:25:51.515782 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod172ff7f9_553a_4125_89e9_4a1082f27994.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:25:52 crc kubenswrapper[5127]: I0201 08:25:52.246549 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca9690af-3744-442b-bae6-7662e965149e" path="/var/lib/kubelet/pods/ca9690af-3744-442b-bae6-7662e965149e/volumes" Feb 01 08:25:53 crc kubenswrapper[5127]: I0201 08:25:53.914473 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 01 08:25:53 crc kubenswrapper[5127]: I0201 08:25:53.995501 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.011237 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c4xng"] Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.018172 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c4xng"] Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.091940 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-87k97"] Feb 01 08:25:56 crc kubenswrapper[5127]: E0201 08:25:56.092304 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="dnsmasq-dns" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.092350 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="dnsmasq-dns" Feb 01 08:25:56 crc kubenswrapper[5127]: E0201 08:25:56.092377 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="init" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.092385 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="init" Feb 01 08:25:56 crc kubenswrapper[5127]: E0201 08:25:56.092411 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="172ff7f9-553a-4125-89e9-4a1082f27994" containerName="mariadb-account-create-update" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.092419 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="172ff7f9-553a-4125-89e9-4a1082f27994" containerName="mariadb-account-create-update" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.092611 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca9690af-3744-442b-bae6-7662e965149e" containerName="dnsmasq-dns" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.092634 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="172ff7f9-553a-4125-89e9-4a1082f27994" containerName="mariadb-account-create-update" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.093283 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.102483 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.103407 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-87k97"] Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.245444 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="172ff7f9-553a-4125-89e9-4a1082f27994" path="/var/lib/kubelet/pods/172ff7f9-553a-4125-89e9-4a1082f27994/volumes" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.258810 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8th9\" (UniqueName: \"kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.259009 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.360906 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.361185 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8th9\" (UniqueName: \"kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.365383 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.407250 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8th9\" (UniqueName: \"kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9\") pod \"root-account-create-update-87k97\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.425160 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-87k97" Feb 01 08:25:56 crc kubenswrapper[5127]: I0201 08:25:56.880733 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-87k97"] Feb 01 08:25:57 crc kubenswrapper[5127]: I0201 08:25:57.408440 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87k97" event={"ID":"5fffed9b-6bbc-4acc-b2a2-222991b5b813","Type":"ContainerStarted","Data":"7a4310277a75f2d4aa63e17e2c467bace1f486d0229a6eb32fd1b6dacc8edb97"} Feb 01 08:25:57 crc kubenswrapper[5127]: I0201 08:25:57.408493 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87k97" event={"ID":"5fffed9b-6bbc-4acc-b2a2-222991b5b813","Type":"ContainerStarted","Data":"3a923307fe99137a74ca125524e8d7d9705f7ce2cb38fcbe78437d5b9044c394"} Feb 01 08:25:57 crc kubenswrapper[5127]: I0201 08:25:57.430290 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-87k97" podStartSLOduration=1.430254607 podStartE2EDuration="1.430254607s" podCreationTimestamp="2026-02-01 08:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:25:57.426324542 +0000 UTC m=+5907.912226935" watchObservedRunningTime="2026-02-01 08:25:57.430254607 +0000 UTC m=+5907.916156970" Feb 01 08:25:59 crc kubenswrapper[5127]: I0201 08:25:59.435337 5127 generic.go:334] "Generic (PLEG): container finished" podID="5fffed9b-6bbc-4acc-b2a2-222991b5b813" containerID="7a4310277a75f2d4aa63e17e2c467bace1f486d0229a6eb32fd1b6dacc8edb97" exitCode=0 Feb 01 08:25:59 crc kubenswrapper[5127]: I0201 08:25:59.435816 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87k97" event={"ID":"5fffed9b-6bbc-4acc-b2a2-222991b5b813","Type":"ContainerDied","Data":"7a4310277a75f2d4aa63e17e2c467bace1f486d0229a6eb32fd1b6dacc8edb97"} Feb 01 08:26:00 crc kubenswrapper[5127]: I0201 08:26:00.784266 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-87k97" Feb 01 08:26:00 crc kubenswrapper[5127]: I0201 08:26:00.939490 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts\") pod \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " Feb 01 08:26:00 crc kubenswrapper[5127]: I0201 08:26:00.939666 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8th9\" (UniqueName: \"kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9\") pod \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\" (UID: \"5fffed9b-6bbc-4acc-b2a2-222991b5b813\") " Feb 01 08:26:00 crc kubenswrapper[5127]: I0201 08:26:00.941450 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fffed9b-6bbc-4acc-b2a2-222991b5b813" (UID: "5fffed9b-6bbc-4acc-b2a2-222991b5b813"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:00 crc kubenswrapper[5127]: I0201 08:26:00.964822 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9" (OuterVolumeSpecName: "kube-api-access-p8th9") pod "5fffed9b-6bbc-4acc-b2a2-222991b5b813" (UID: "5fffed9b-6bbc-4acc-b2a2-222991b5b813"). InnerVolumeSpecName "kube-api-access-p8th9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:01 crc kubenswrapper[5127]: I0201 08:26:01.041632 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fffed9b-6bbc-4acc-b2a2-222991b5b813-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:01 crc kubenswrapper[5127]: I0201 08:26:01.041675 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8th9\" (UniqueName: \"kubernetes.io/projected/5fffed9b-6bbc-4acc-b2a2-222991b5b813-kube-api-access-p8th9\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:01 crc kubenswrapper[5127]: I0201 08:26:01.454571 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-87k97" event={"ID":"5fffed9b-6bbc-4acc-b2a2-222991b5b813","Type":"ContainerDied","Data":"3a923307fe99137a74ca125524e8d7d9705f7ce2cb38fcbe78437d5b9044c394"} Feb 01 08:26:01 crc kubenswrapper[5127]: I0201 08:26:01.454661 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a923307fe99137a74ca125524e8d7d9705f7ce2cb38fcbe78437d5b9044c394" Feb 01 08:26:01 crc kubenswrapper[5127]: I0201 08:26:01.454677 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-87k97" Feb 01 08:26:15 crc kubenswrapper[5127]: I0201 08:26:15.586731 5127 generic.go:334] "Generic (PLEG): container finished" podID="35805b39-1109-45f6-a3eb-41804335ad1a" containerID="53067efc201c5cf14c26802f3eff339c16637b0f4ddba4a1ae77149c5b65d251" exitCode=0 Feb 01 08:26:15 crc kubenswrapper[5127]: I0201 08:26:15.586794 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerDied","Data":"53067efc201c5cf14c26802f3eff339c16637b0f4ddba4a1ae77149c5b65d251"} Feb 01 08:26:16 crc kubenswrapper[5127]: I0201 08:26:16.597840 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerStarted","Data":"0eb04cf50558addb5c54a40d511cd1ee666d0d6916db8da744e1ff3dafde161e"} Feb 01 08:26:16 crc kubenswrapper[5127]: I0201 08:26:16.599113 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 01 08:26:16 crc kubenswrapper[5127]: I0201 08:26:16.600710 5127 generic.go:334] "Generic (PLEG): container finished" podID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerID="8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2" exitCode=0 Feb 01 08:26:16 crc kubenswrapper[5127]: I0201 08:26:16.600771 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerDied","Data":"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2"} Feb 01 08:26:16 crc kubenswrapper[5127]: I0201 08:26:16.623080 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-server-0" podStartSLOduration=37.60824593 podStartE2EDuration="1m12.623063851s" podCreationTimestamp="2026-02-01 08:25:04 +0000 UTC" firstStartedPulling="2026-02-01 08:25:06.44664314 +0000 UTC m=+5856.932545503" lastFinishedPulling="2026-02-01 08:25:41.461461061 +0000 UTC m=+5891.947363424" observedRunningTime="2026-02-01 08:26:16.621235053 +0000 UTC m=+5927.107137416" watchObservedRunningTime="2026-02-01 08:26:16.623063851 +0000 UTC m=+5927.108966214" Feb 01 08:26:17 crc kubenswrapper[5127]: I0201 08:26:17.608462 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerStarted","Data":"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b"} Feb 01 08:26:17 crc kubenswrapper[5127]: I0201 08:26:17.632479 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371963.222315 podStartE2EDuration="1m13.63246138s" podCreationTimestamp="2026-02-01 08:25:04 +0000 UTC" firstStartedPulling="2026-02-01 08:25:06.887955768 +0000 UTC m=+5857.373858131" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:26:17.631235766 +0000 UTC m=+5928.117138149" watchObservedRunningTime="2026-02-01 08:26:17.63246138 +0000 UTC m=+5928.118363743" Feb 01 08:26:25 crc kubenswrapper[5127]: I0201 08:26:25.785762 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 01 08:26:26 crc kubenswrapper[5127]: I0201 08:26:26.349119 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:36 crc kubenswrapper[5127]: I0201 08:26:36.351069 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:36 crc kubenswrapper[5127]: I0201 08:26:36.740263 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:26:36 crc kubenswrapper[5127]: I0201 08:26:36.740561 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:26:38 crc kubenswrapper[5127]: I0201 08:26:38.992136 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:26:38 crc kubenswrapper[5127]: E0201 08:26:38.992554 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fffed9b-6bbc-4acc-b2a2-222991b5b813" containerName="mariadb-account-create-update" Feb 01 08:26:38 crc kubenswrapper[5127]: I0201 08:26:38.992572 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fffed9b-6bbc-4acc-b2a2-222991b5b813" containerName="mariadb-account-create-update" Feb 01 08:26:38 crc kubenswrapper[5127]: I0201 08:26:38.993092 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fffed9b-6bbc-4acc-b2a2-222991b5b813" containerName="mariadb-account-create-update" Feb 01 08:26:38 crc kubenswrapper[5127]: I0201 08:26:38.993962 5127 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.008946 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.139567 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.139672 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lxdh\" (UniqueName: \"kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.139760 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.241284 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.241369 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.241410 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lxdh\" (UniqueName: \"kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.242713 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.243181 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.265433 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lxdh\" (UniqueName: 
\"kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh\") pod \"dnsmasq-dns-cbb469cd9-wc4fv\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.321776 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.625274 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:39 crc kubenswrapper[5127]: I0201 08:26:39.818015 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:26:40 crc kubenswrapper[5127]: I0201 08:26:40.154173 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:40 crc kubenswrapper[5127]: I0201 08:26:40.790472 5127 generic.go:334] "Generic (PLEG): container finished" podID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerID="36e54476cffe04c5394ab0e8f6b3dbf258f9c6c320142132373db5cf55546582" exitCode=0 Feb 01 08:26:40 crc kubenswrapper[5127]: I0201 08:26:40.790556 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" event={"ID":"6b4b6f0f-e6e8-40a8-befa-73332751b5a1","Type":"ContainerDied","Data":"36e54476cffe04c5394ab0e8f6b3dbf258f9c6c320142132373db5cf55546582"} Feb 01 08:26:40 crc kubenswrapper[5127]: I0201 08:26:40.790753 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" event={"ID":"6b4b6f0f-e6e8-40a8-befa-73332751b5a1","Type":"ContainerStarted","Data":"284eaf68a1007be940d00d74aa173e4d5a69ac8d8ad52320d569fb2dd1f92a91"} Feb 01 08:26:41 crc kubenswrapper[5127]: I0201 08:26:41.514268 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="rabbitmq" containerID="cri-o://0eb04cf50558addb5c54a40d511cd1ee666d0d6916db8da744e1ff3dafde161e" gracePeriod=604799 Feb 01 08:26:41 crc kubenswrapper[5127]: I0201 08:26:41.799895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" event={"ID":"6b4b6f0f-e6e8-40a8-befa-73332751b5a1","Type":"ContainerStarted","Data":"da956daa74f03cc11ce83ba491f3d1510f66b6ffa98a50c2a819e6e34b48fa6b"} Feb 01 08:26:41 crc kubenswrapper[5127]: I0201 08:26:41.800045 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:41 crc kubenswrapper[5127]: I0201 08:26:41.828377 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" podStartSLOduration=3.828345344 podStartE2EDuration="3.828345344s" podCreationTimestamp="2026-02-01 08:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:26:41.822831306 +0000 UTC m=+5952.308733709" watchObservedRunningTime="2026-02-01 08:26:41.828345344 +0000 UTC m=+5952.314247707" Feb 01 08:26:41 crc kubenswrapper[5127]: I0201 08:26:41.939142 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="rabbitmq" containerID="cri-o://dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b" gracePeriod=604799 Feb 01 08:26:45 crc 
kubenswrapper[5127]: I0201 08:26:45.780874 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.252:5672: connect: connection refused" Feb 01 08:26:46 crc kubenswrapper[5127]: I0201 08:26:46.349617 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.253:5672: connect: connection refused" Feb 01 08:26:47 crc kubenswrapper[5127]: I0201 08:26:47.857985 5127 generic.go:334] "Generic (PLEG): container finished" podID="35805b39-1109-45f6-a3eb-41804335ad1a" containerID="0eb04cf50558addb5c54a40d511cd1ee666d0d6916db8da744e1ff3dafde161e" exitCode=0 Feb 01 08:26:47 crc kubenswrapper[5127]: I0201 08:26:47.858118 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerDied","Data":"0eb04cf50558addb5c54a40d511cd1ee666d0d6916db8da744e1ff3dafde161e"} Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.377294 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490192 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490261 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nqsk\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490518 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490558 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490660 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490690 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490726 
5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490857 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.490911 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd\") pod \"35805b39-1109-45f6-a3eb-41804335ad1a\" (UID: \"35805b39-1109-45f6-a3eb-41804335ad1a\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.491775 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.492202 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.492440 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.496147 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info" (OuterVolumeSpecName: "pod-info") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.497162 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk" (OuterVolumeSpecName: "kube-api-access-4nqsk") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "kube-api-access-4nqsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.505541 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.506351 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c" (OuterVolumeSpecName: "persistence") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.508941 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.524573 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf" (OuterVolumeSpecName: "server-conf") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592381 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592420 5127 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592431 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592462 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") on node \"crc\" " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592477 5127 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35805b39-1109-45f6-a3eb-41804335ad1a-pod-info\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592486 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nqsk\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-kube-api-access-4nqsk\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592495 5127 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35805b39-1109-45f6-a3eb-41804335ad1a-erlang-cookie-secret\") on node \"crc\" DevicePath 
\"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.592503 5127 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35805b39-1109-45f6-a3eb-41804335ad1a-server-conf\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.606060 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "35805b39-1109-45f6-a3eb-41804335ad1a" (UID: "35805b39-1109-45f6-a3eb-41804335ad1a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.608012 5127 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.608160 5127 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c") on node "crc" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693376 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693434 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693546 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693595 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693630 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693644 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693666 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693775 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.693821 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7nw9\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9\") pod \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\" (UID: \"b54d8b4c-18e6-4d64-ae91-4434f3eb3552\") " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.694057 5127 reconciler_common.go:293] "Volume detached for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.694075 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35805b39-1109-45f6-a3eb-41804335ad1a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.694427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.695015 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.695360 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.697353 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.697431 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9" (OuterVolumeSpecName: "kube-api-access-g7nw9") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "kube-api-access-g7nw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.697747 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info" (OuterVolumeSpecName: "pod-info") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.708427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4" (OuterVolumeSpecName: "persistence") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.718789 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf" (OuterVolumeSpecName: "server-conf") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.772041 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b54d8b4c-18e6-4d64-ae91-4434f3eb3552" (UID: "b54d8b4c-18e6-4d64-ae91-4434f3eb3552"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795358 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795394 5127 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795402 5127 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-server-conf\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795411 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795447 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") on node \"crc\" " Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795460 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7nw9\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-kube-api-access-g7nw9\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795469 5127 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795477 5127 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.795485 5127 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b54d8b4c-18e6-4d64-ae91-4434f3eb3552-pod-info\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.810682 5127 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.810833 5127 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4") on node "crc" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.866238 5127 generic.go:334] "Generic (PLEG): container finished" podID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerID="dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b" exitCode=0 Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.866291 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.866288 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerDied","Data":"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b"} Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.866333 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b54d8b4c-18e6-4d64-ae91-4434f3eb3552","Type":"ContainerDied","Data":"c7a3c50645c7f49447216d33ea9eceba29fe56a0647d43aebb99bdad839a9e0c"} Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.866349 5127 scope.go:117] "RemoveContainer" containerID="dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.869338 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35805b39-1109-45f6-a3eb-41804335ad1a","Type":"ContainerDied","Data":"0ee045a4942b41df2263008ec9e5726e190cde1b99058c6ca1cbb1948c5a210b"} Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.869410 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.899396 5127 scope.go:117] "RemoveContainer" containerID="8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.900668 5127 reconciler_common.go:293] "Volume detached for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.908683 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.919312 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.933358 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.944868 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.951504 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.951891 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="setup-container" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.951916 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="setup-container" Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.951928 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.951935 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.951953 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" 
containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.951959 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.951974 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="setup-container" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.951979 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="setup-container" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.952122 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.952133 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" containerName="rabbitmq" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.952916 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.955045 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vgk6b" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.955164 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.956923 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.957224 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.957406 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.959092 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.961025 5127 scope.go:117] "RemoveContainer" containerID="dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b" Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.961504 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b\": container with ID starting with dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b not found: ID does not exist" containerID="dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.961540 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b"} err="failed to get container status \"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b\": rpc error: code = NotFound desc = could not find container \"dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b\": container with ID starting with dddabe5ea0e12e88288a52e4c3fc9bcc452e633ce314898c4e60bad60627707b not found: ID does not exist" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.961564 5127 scope.go:117] "RemoveContainer" 
containerID="8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2" Feb 01 08:26:48 crc kubenswrapper[5127]: E0201 08:26:48.962019 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2\": container with ID starting with 8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2 not found: ID does not exist" containerID="8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.962044 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2"} err="failed to get container status \"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2\": rpc error: code = NotFound desc = could not find container \"8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2\": container with ID starting with 8c4de4e75704d9e9477b5e61ef6fc9d20d057f5aef11fd9333d5d9ab1db7a6a2 not found: ID does not exist" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.962062 5127 scope.go:117] "RemoveContainer" containerID="0eb04cf50558addb5c54a40d511cd1ee666d0d6916db8da744e1ff3dafde161e" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.966000 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.967420 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.973337 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.973745 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-9ztnb" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.973990 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.974173 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.974396 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.984452 5127 scope.go:117] "RemoveContainer" containerID="53067efc201c5cf14c26802f3eff339c16637b0f4ddba4a1ae77149c5b65d251" Feb 01 08:26:48 crc kubenswrapper[5127]: I0201 08:26:48.992178 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.104960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105030 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4052af64-63c0-4e94-bcb9-96463c2e98ce-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105066 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bb04d8b-7336-442e-ab61-a9a207787027-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105093 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bb04d8b-7336-442e-ab61-a9a207787027-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105118 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhjv\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-kube-api-access-4zhjv\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105149 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105179 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6n5g\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-kube-api-access-h6n5g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105207 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105230 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105256 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105283 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105309 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105330 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105354 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105380 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105401 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105455 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4052af64-63c0-4e94-bcb9-96463c2e98ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.105481 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206413 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4052af64-63c0-4e94-bcb9-96463c2e98ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206512 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bb04d8b-7336-442e-ab61-a9a207787027-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206535 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bb04d8b-7336-442e-ab61-a9a207787027-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206559 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zhjv\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-kube-api-access-4zhjv\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206598 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206636 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6n5g\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-kube-api-access-h6n5g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206661 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206682 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206704 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206730 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206781 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206805 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206829 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206852 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206905 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4052af64-63c0-4e94-bcb9-96463c2e98ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.206929 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.208502 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.208776 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4052af64-63c0-4e94-bcb9-96463c2e98ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc 
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.208821 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.208845 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.209064 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.209497 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bb04d8b-7336-442e-ab61-a9a207787027-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.211180 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.211757 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.211780 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88e60ebfd736eab7c95666109ce6af3bc7b62ad4c675fb78d10cca87ffe539bd/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.212000 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.212864 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bb04d8b-7336-442e-ab61-a9a207787027-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.213108 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bb04d8b-7336-442e-ab61-a9a207787027-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.215032 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4052af64-63c0-4e94-bcb9-96463c2e98ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.215458 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bb04d8b-7336-442e-ab61-a9a207787027-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.215976 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.216075 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cecaf4f1d642cec9adaf1e51c239b9030e1fd6747f1ef12f89f749c4cf42182/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.218921 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.223525 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4052af64-63c0-4e94-bcb9-96463c2e98ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.226836 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6n5g\" (UniqueName: \"kubernetes.io/projected/4052af64-63c0-4e94-bcb9-96463c2e98ce-kube-api-access-h6n5g\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.228051 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zhjv\" (UniqueName: \"kubernetes.io/projected/6bb04d8b-7336-442e-ab61-a9a207787027-kube-api-access-4zhjv\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.248608 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6893dbde-c741-4097-9a89-d800cb3bb3b4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4052af64-63c0-4e94-bcb9-96463c2e98ce\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.256724 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-62da7e11-c19f-4b82-9ad9-dd2b6a08b88c\") pod \"rabbitmq-server-0\" (UID: \"6bb04d8b-7336-442e-ab61-a9a207787027\") " pod="openstack/rabbitmq-server-0"
Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.292501 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.323828 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.386614 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.387292 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="dnsmasq-dns" containerID="cri-o://6bb87abcdea6b045b58aa2a6d2b524b65577a9c446772269e91c56253472d38e" gracePeriod=10 Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.794499 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.866312 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.878254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4052af64-63c0-4e94-bcb9-96463c2e98ce","Type":"ContainerStarted","Data":"0e0af6dd2bd633f9516404f8d3fb12df6a3835be14cf3ca2ba2ea1c282c6f307"} Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.881424 5127 generic.go:334] "Generic (PLEG): container finished" podID="98e9f297-bf26-40be-8ad6-281852730569" containerID="6bb87abcdea6b045b58aa2a6d2b524b65577a9c446772269e91c56253472d38e" exitCode=0 Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.881462 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" event={"ID":"98e9f297-bf26-40be-8ad6-281852730569","Type":"ContainerDied","Data":"6bb87abcdea6b045b58aa2a6d2b524b65577a9c446772269e91c56253472d38e"} Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.881477 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" event={"ID":"98e9f297-bf26-40be-8ad6-281852730569","Type":"ContainerDied","Data":"87dd9c4d48242748a2e89621bb50fb4c2c47d10949335dbaf466a4d2bcfee74f"} Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.881486 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87dd9c4d48242748a2e89621bb50fb4c2c47d10949335dbaf466a4d2bcfee74f" Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.884131 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bb04d8b-7336-442e-ab61-a9a207787027","Type":"ContainerStarted","Data":"881c8e7994f8d61465b3b720d1aeef5bf797a72ddbc83f074ce2681bb75cc1c2"} Feb 01 08:26:49 crc kubenswrapper[5127]: I0201 08:26:49.946614 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.126867 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config\") pod \"98e9f297-bf26-40be-8ad6-281852730569\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.127000 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc\") pod \"98e9f297-bf26-40be-8ad6-281852730569\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.127071 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hk2r\" (UniqueName: \"kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r\") pod \"98e9f297-bf26-40be-8ad6-281852730569\" (UID: \"98e9f297-bf26-40be-8ad6-281852730569\") " Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.131475 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r" (OuterVolumeSpecName: "kube-api-access-8hk2r") pod "98e9f297-bf26-40be-8ad6-281852730569" (UID: "98e9f297-bf26-40be-8ad6-281852730569"). InnerVolumeSpecName "kube-api-access-8hk2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.178574 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98e9f297-bf26-40be-8ad6-281852730569" (UID: "98e9f297-bf26-40be-8ad6-281852730569"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.184424 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config" (OuterVolumeSpecName: "config") pod "98e9f297-bf26-40be-8ad6-281852730569" (UID: "98e9f297-bf26-40be-8ad6-281852730569"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.228609 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.228658 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hk2r\" (UniqueName: \"kubernetes.io/projected/98e9f297-bf26-40be-8ad6-281852730569-kube-api-access-8hk2r\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.228673 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98e9f297-bf26-40be-8ad6-281852730569-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.246418 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35805b39-1109-45f6-a3eb-41804335ad1a" path="/var/lib/kubelet/pods/35805b39-1109-45f6-a3eb-41804335ad1a/volumes" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.247760 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54d8b4c-18e6-4d64-ae91-4434f3eb3552" path="/var/lib/kubelet/pods/b54d8b4c-18e6-4d64-ae91-4434f3eb3552/volumes" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.893941 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.918805 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:26:50 crc kubenswrapper[5127]: I0201 08:26:50.925213 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d5cc45bbc-s556r"] Feb 01 08:26:51 crc kubenswrapper[5127]: I0201 08:26:51.912914 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4052af64-63c0-4e94-bcb9-96463c2e98ce","Type":"ContainerStarted","Data":"c9935bf7151ecb006e0af71cd8c2c0d4ecaba3d35077c4760c7dffbb24c610fb"} Feb 01 08:26:51 crc kubenswrapper[5127]: I0201 08:26:51.921558 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bb04d8b-7336-442e-ab61-a9a207787027","Type":"ContainerStarted","Data":"5ab801c3ecb1d452807f95431e0dfed464700291729e9f98fd942a86f36f8051"} Feb 01 08:26:52 crc kubenswrapper[5127]: I0201 08:26:52.245065 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98e9f297-bf26-40be-8ad6-281852730569" path="/var/lib/kubelet/pods/98e9f297-bf26-40be-8ad6-281852730569/volumes" Feb 01 08:26:54 crc kubenswrapper[5127]: I0201 08:26:54.856027 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d5cc45bbc-s556r" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.251:5353: i/o timeout" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.949031 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:02 crc kubenswrapper[5127]: E0201 08:27:02.950865 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="init" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.950961 5127 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="init" Feb 01 08:27:02 crc kubenswrapper[5127]: E0201 08:27:02.951032 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="dnsmasq-dns" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.951091 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="dnsmasq-dns" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.951359 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e9f297-bf26-40be-8ad6-281852730569" containerName="dnsmasq-dns" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.952506 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:02 crc kubenswrapper[5127]: I0201 08:27:02.966168 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.050694 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sm4t\" (UniqueName: \"kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.050778 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.050889 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.152466 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.153315 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sm4t\" (UniqueName: \"kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.153608 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.154223 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.154404 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.177989 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sm4t\" (UniqueName: \"kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t\") pod \"community-operators-wwfpl\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.290780 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:03 crc kubenswrapper[5127]: I0201 08:27:03.889282 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:03 crc kubenswrapper[5127]: W0201 08:27:03.896158 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa4108f_99a0_4b74_af01_c75369988b49.slice/crio-7e1a4aa6286013e92d9566892564caccbf629e1ae2790849826f7424d895b411 WatchSource:0}: Error finding container 7e1a4aa6286013e92d9566892564caccbf629e1ae2790849826f7424d895b411: Status 404 returned error can't find the container with id 7e1a4aa6286013e92d9566892564caccbf629e1ae2790849826f7424d895b411 Feb 01 08:27:04 crc kubenswrapper[5127]: I0201 08:27:04.037887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerStarted","Data":"7e1a4aa6286013e92d9566892564caccbf629e1ae2790849826f7424d895b411"} Feb 01 08:27:05 crc kubenswrapper[5127]: I0201 08:27:05.049195 5127 generic.go:334] "Generic (PLEG): container finished" podID="9fa4108f-99a0-4b74-af01-c75369988b49" containerID="52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb" exitCode=0 Feb 01 08:27:05 crc kubenswrapper[5127]: I0201 08:27:05.049302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerDied","Data":"52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb"} Feb 01 08:27:06 crc kubenswrapper[5127]: I0201 08:27:06.057648 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerStarted","Data":"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17"} Feb 01 08:27:06 crc kubenswrapper[5127]: I0201 08:27:06.741097 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 01 08:27:06 crc kubenswrapper[5127]: I0201 08:27:06.741169 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:27:07 crc kubenswrapper[5127]: I0201 08:27:07.067700 5127 generic.go:334] "Generic (PLEG): container finished" podID="9fa4108f-99a0-4b74-af01-c75369988b49" containerID="bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17" exitCode=0 Feb 01 08:27:07 crc kubenswrapper[5127]: I0201 08:27:07.067748 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerDied","Data":"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17"} Feb 01 08:27:08 crc kubenswrapper[5127]: I0201 08:27:08.080117 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerStarted","Data":"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123"} Feb 01 08:27:08 crc kubenswrapper[5127]: I0201 08:27:08.097904 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wwfpl" podStartSLOduration=3.65338655 podStartE2EDuration="6.097881926s" podCreationTimestamp="2026-02-01 08:27:02 +0000 UTC" firstStartedPulling="2026-02-01 08:27:05.051163035 +0000 UTC m=+5975.537065408" lastFinishedPulling="2026-02-01 08:27:07.495658411 +0000 UTC m=+5977.981560784" observedRunningTime="2026-02-01 08:27:08.094461765 +0000 UTC m=+5978.580364158" watchObservedRunningTime="2026-02-01 08:27:08.097881926 +0000 UTC m=+5978.583784299" Feb 01 08:27:13 crc kubenswrapper[5127]: I0201 08:27:13.290884 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:13 crc kubenswrapper[5127]: I0201 08:27:13.291878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:13 crc kubenswrapper[5127]: I0201 08:27:13.358417 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:14 crc kubenswrapper[5127]: I0201 08:27:14.200794 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:14 crc kubenswrapper[5127]: I0201 08:27:14.260392 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.151644 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wwfpl" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="registry-server" containerID="cri-o://9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123" gracePeriod=2 Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.585114 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.682121 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content\") pod \"9fa4108f-99a0-4b74-af01-c75369988b49\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.682225 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sm4t\" (UniqueName: \"kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t\") pod \"9fa4108f-99a0-4b74-af01-c75369988b49\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.682259 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities\") pod \"9fa4108f-99a0-4b74-af01-c75369988b49\" (UID: \"9fa4108f-99a0-4b74-af01-c75369988b49\") " Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.683513 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities" (OuterVolumeSpecName: "utilities") pod "9fa4108f-99a0-4b74-af01-c75369988b49" (UID: "9fa4108f-99a0-4b74-af01-c75369988b49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.688482 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t" (OuterVolumeSpecName: "kube-api-access-2sm4t") pod "9fa4108f-99a0-4b74-af01-c75369988b49" (UID: "9fa4108f-99a0-4b74-af01-c75369988b49"). InnerVolumeSpecName "kube-api-access-2sm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.784409 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sm4t\" (UniqueName: \"kubernetes.io/projected/9fa4108f-99a0-4b74-af01-c75369988b49-kube-api-access-2sm4t\") on node \"crc\" DevicePath \"\"" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.784465 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.976401 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fa4108f-99a0-4b74-af01-c75369988b49" (UID: "9fa4108f-99a0-4b74-af01-c75369988b49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:27:16 crc kubenswrapper[5127]: I0201 08:27:16.986916 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fa4108f-99a0-4b74-af01-c75369988b49-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.165153 5127 generic.go:334] "Generic (PLEG): container finished" podID="9fa4108f-99a0-4b74-af01-c75369988b49" containerID="9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123" exitCode=0 Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.165219 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerDied","Data":"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123"} Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.165263 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwfpl" event={"ID":"9fa4108f-99a0-4b74-af01-c75369988b49","Type":"ContainerDied","Data":"7e1a4aa6286013e92d9566892564caccbf629e1ae2790849826f7424d895b411"} Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.165297 5127 scope.go:117] "RemoveContainer" containerID="9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.165504 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwfpl" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.192346 5127 scope.go:117] "RemoveContainer" containerID="bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.225693 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.234871 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wwfpl"] Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.239893 5127 scope.go:117] "RemoveContainer" containerID="52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.300962 5127 scope.go:117] "RemoveContainer" containerID="9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123" Feb 01 08:27:17 crc kubenswrapper[5127]: E0201 08:27:17.301554 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123\": container with ID starting with 9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123 not found: ID does not exist" containerID="9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123" Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.301653 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123"} err="failed to get container status \"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123\": rpc error: code = NotFound desc = could not find container \"9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123\": container with ID starting with 9ad8514f35f185b025cd7807a99f226f53da0ecaa4293f3675e4cf2178c82123 not found: ID does not exist" Feb 01 
Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.301702 5127 scope.go:117] "RemoveContainer" containerID="bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17"
Feb 01 08:27:17 crc kubenswrapper[5127]: E0201 08:27:17.302132 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17\": container with ID starting with bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17 not found: ID does not exist" containerID="bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17"
Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.302227 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17"} err="failed to get container status \"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17\": rpc error: code = NotFound desc = could not find container \"bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17\": container with ID starting with bc62d1354eeb19c705e2070739c74e97f42f9585142c2e4ba958c2e45538ef17 not found: ID does not exist"
Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.302317 5127 scope.go:117] "RemoveContainer" containerID="52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb"
Feb 01 08:27:17 crc kubenswrapper[5127]: E0201 08:27:17.302761 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb\": container with ID starting with 52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb not found: ID does not exist" containerID="52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb"
Feb 01 08:27:17 crc kubenswrapper[5127]: I0201 08:27:17.302820 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb"} err="failed to get container status \"52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb\": rpc error: code = NotFound desc = could not find container \"52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb\": container with ID starting with 52de0888d18fce6931dd569e8119fe97b7a2a2e6d0e8514430498f067e7ce0cb not found: ID does not exist"
Feb 01 08:27:18 crc kubenswrapper[5127]: I0201 08:27:18.245327 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" path="/var/lib/kubelet/pods/9fa4108f-99a0-4b74-af01-c75369988b49/volumes"
Feb 01 08:27:24 crc kubenswrapper[5127]: I0201 08:27:24.232782 5127 generic.go:334] "Generic (PLEG): container finished" podID="4052af64-63c0-4e94-bcb9-96463c2e98ce" containerID="c9935bf7151ecb006e0af71cd8c2c0d4ecaba3d35077c4760c7dffbb24c610fb" exitCode=0
Feb 01 08:27:24 crc kubenswrapper[5127]: I0201 08:27:24.232941 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4052af64-63c0-4e94-bcb9-96463c2e98ce","Type":"ContainerDied","Data":"c9935bf7151ecb006e0af71cd8c2c0d4ecaba3d35077c4760c7dffbb24c610fb"}
Feb 01 08:27:24 crc kubenswrapper[5127]: I0201 08:27:24.235552 5127 generic.go:334] "Generic (PLEG): container finished" podID="6bb04d8b-7336-442e-ab61-a9a207787027" containerID="5ab801c3ecb1d452807f95431e0dfed464700291729e9f98fd942a86f36f8051" exitCode=0
Feb 01 08:27:24 crc kubenswrapper[5127]: I0201 08:27:24.251718 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bb04d8b-7336-442e-ab61-a9a207787027","Type":"ContainerDied","Data":"5ab801c3ecb1d452807f95431e0dfed464700291729e9f98fd942a86f36f8051"}
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.246704 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4052af64-63c0-4e94-bcb9-96463c2e98ce","Type":"ContainerStarted","Data":"931ca6ffc9f617ae855b317417833ecb807017737fb5ecd25591c668b52e0e16"}
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.247529 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.248800 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bb04d8b-7336-442e-ab61-a9a207787027","Type":"ContainerStarted","Data":"24673e9dc06f34e71250f24e3362561ab15db3e54fda992e8ef9ba791613b099"}
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.249179 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.285642 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.285621782 podStartE2EDuration="37.285621782s" podCreationTimestamp="2026-02-01 08:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:27:25.277425253 +0000 UTC m=+5995.763327636" watchObservedRunningTime="2026-02-01 08:27:25.285621782 +0000 UTC m=+5995.771524155"
Feb 01 08:27:25 crc kubenswrapper[5127]: I0201 08:27:25.312224 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.312207846 podStartE2EDuration="37.312207846s" podCreationTimestamp="2026-02-01 08:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:27:25.308717033 +0000 UTC m=+5995.794619406" watchObservedRunningTime="2026-02-01 08:27:25.312207846 +0000 UTC m=+5995.798110219"
Feb 01 08:27:36 crc kubenswrapper[5127]: I0201 08:27:36.741412 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:27:36 crc kubenswrapper[5127]: I0201 08:27:36.742125 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:27:36 crc kubenswrapper[5127]: I0201 08:27:36.742193 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk"
containerStatusID={"Type":"cri-o","ID":"cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:27:36 crc kubenswrapper[5127]: I0201 08:27:36.743262 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" gracePeriod=600 Feb 01 08:27:36 crc kubenswrapper[5127]: E0201 08:27:36.887213 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:27:37 crc kubenswrapper[5127]: I0201 08:27:37.409651 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" exitCode=0 Feb 01 08:27:37 crc kubenswrapper[5127]: I0201 08:27:37.409727 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"} Feb 01 08:27:37 crc kubenswrapper[5127]: I0201 08:27:37.410380 5127 scope.go:117] "RemoveContainer" containerID="e8546cf522e64bb70f00ae812b7bcad51e2ab924612aab28f74b72ec437b8682" Feb 01 08:27:37 crc kubenswrapper[5127]: I0201 08:27:37.410826 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:27:37 crc kubenswrapper[5127]: E0201 08:27:37.411088 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:27:39 crc kubenswrapper[5127]: I0201 08:27:39.295975 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 01 08:27:39 crc kubenswrapper[5127]: I0201 08:27:39.296991 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.800408 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 01 08:27:45 crc kubenswrapper[5127]: E0201 08:27:45.801804 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="extract-content" Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.801832 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="extract-content" Feb 01 08:27:45 crc kubenswrapper[5127]: E0201 08:27:45.801846 5127 
Feb 01 08:27:45 crc kubenswrapper[5127]: E0201 08:27:45.801846 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="registry-server"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.801860 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="registry-server"
Feb 01 08:27:45 crc kubenswrapper[5127]: E0201 08:27:45.801886 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="extract-utilities"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.801901 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="extract-utilities"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.802228 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa4108f-99a0-4b74-af01-c75369988b49" containerName="registry-server"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.803463 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.808083 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hxbhz"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.815102 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.819777 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxvmh\" (UniqueName: \"kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh\") pod \"mariadb-client\" (UID: \"9b1fd601-c067-479f-b590-fceaf3ecebaa\") " pod="openstack/mariadb-client"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.922474 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxvmh\" (UniqueName: \"kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh\") pod \"mariadb-client\" (UID: \"9b1fd601-c067-479f-b590-fceaf3ecebaa\") " pod="openstack/mariadb-client"
Feb 01 08:27:45 crc kubenswrapper[5127]: I0201 08:27:45.961140 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxvmh\" (UniqueName: \"kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh\") pod \"mariadb-client\" (UID: \"9b1fd601-c067-479f-b590-fceaf3ecebaa\") " pod="openstack/mariadb-client"
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:27:46 crc kubenswrapper[5127]: I0201 08:27:46.750167 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:27:47 crc kubenswrapper[5127]: I0201 08:27:47.514281 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9b1fd601-c067-479f-b590-fceaf3ecebaa","Type":"ContainerStarted","Data":"6b25385491d72d98320ee657aa5b70aef1bb7dee6baffb4c0ededc916ce1dae7"} Feb 01 08:27:50 crc kubenswrapper[5127]: I0201 08:27:50.540516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9b1fd601-c067-479f-b590-fceaf3ecebaa","Type":"ContainerStarted","Data":"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5"} Feb 01 08:27:50 crc kubenswrapper[5127]: I0201 08:27:50.564180 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.042830814 podStartE2EDuration="5.564152274s" podCreationTimestamp="2026-02-01 08:27:45 +0000 UTC" firstStartedPulling="2026-02-01 08:27:46.75943878 +0000 UTC m=+6017.245341153" lastFinishedPulling="2026-02-01 08:27:50.28076021 +0000 UTC m=+6020.766662613" observedRunningTime="2026-02-01 08:27:50.560425114 +0000 UTC m=+6021.046327497" watchObservedRunningTime="2026-02-01 08:27:50.564152274 +0000 UTC m=+6021.050054677" Feb 01 08:27:52 crc kubenswrapper[5127]: I0201 08:27:52.236214 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:27:52 crc kubenswrapper[5127]: E0201 08:27:52.237998 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:28:01 crc kubenswrapper[5127]: E0201 08:28:01.059016 5127 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.22:42696->38.102.83.22:39685: write tcp 38.102.83.22:42696->38.102.83.22:39685: write: broken pipe Feb 01 08:28:03 crc kubenswrapper[5127]: I0201 08:28:03.236633 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:28:03 crc kubenswrapper[5127]: E0201 08:28:03.237377 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:28:04 crc kubenswrapper[5127]: I0201 08:28:04.982917 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:28:04 crc kubenswrapper[5127]: I0201 08:28:04.983224 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="9b1fd601-c067-479f-b590-fceaf3ecebaa" containerName="mariadb-client" containerID="cri-o://16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5" gracePeriod=30 Feb 01 08:28:05 crc 
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.466850 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.596120 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxvmh\" (UniqueName: \"kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh\") pod \"9b1fd601-c067-479f-b590-fceaf3ecebaa\" (UID: \"9b1fd601-c067-479f-b590-fceaf3ecebaa\") "
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.600541 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh" (OuterVolumeSpecName: "kube-api-access-nxvmh") pod "9b1fd601-c067-479f-b590-fceaf3ecebaa" (UID: "9b1fd601-c067-479f-b590-fceaf3ecebaa"). InnerVolumeSpecName "kube-api-access-nxvmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.698262 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxvmh\" (UniqueName: \"kubernetes.io/projected/9b1fd601-c067-479f-b590-fceaf3ecebaa-kube-api-access-nxvmh\") on node \"crc\" DevicePath \"\""
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.711338 5127 generic.go:334] "Generic (PLEG): container finished" podID="9b1fd601-c067-479f-b590-fceaf3ecebaa" containerID="16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5" exitCode=143
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.711381 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9b1fd601-c067-479f-b590-fceaf3ecebaa","Type":"ContainerDied","Data":"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5"}
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.711415 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"9b1fd601-c067-479f-b590-fceaf3ecebaa","Type":"ContainerDied","Data":"6b25385491d72d98320ee657aa5b70aef1bb7dee6baffb4c0ededc916ce1dae7"}
Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.711417 5127 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.711434 5127 scope.go:117] "RemoveContainer" containerID="16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5" Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.748969 5127 scope.go:117] "RemoveContainer" containerID="16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5" Feb 01 08:28:05 crc kubenswrapper[5127]: E0201 08:28:05.749891 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5\": container with ID starting with 16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5 not found: ID does not exist" containerID="16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5" Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.749953 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5"} err="failed to get container status \"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5\": rpc error: code = NotFound desc = could not find container \"16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5\": container with ID starting with 16110bd00a5e9d7b451b096afdb966a3c59d6f291bb746f1342f11047b4c8fb5 not found: ID does not exist" Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.750306 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:28:05 crc kubenswrapper[5127]: I0201 08:28:05.756964 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:28:06 crc kubenswrapper[5127]: I0201 08:28:06.248307 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1fd601-c067-479f-b590-fceaf3ecebaa" path="/var/lib/kubelet/pods/9b1fd601-c067-479f-b590-fceaf3ecebaa/volumes" Feb 01 08:28:14 crc kubenswrapper[5127]: I0201 08:28:14.236320 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:28:14 crc kubenswrapper[5127]: E0201 08:28:14.237163 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:28:27 crc kubenswrapper[5127]: I0201 08:28:27.236572 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:28:27 crc kubenswrapper[5127]: E0201 08:28:27.239558 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:28:42 crc kubenswrapper[5127]: I0201 08:28:42.236193 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 
Feb 01 08:28:42 crc kubenswrapper[5127]: E0201 08:28:42.237163 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:28:54 crc kubenswrapper[5127]: I0201 08:28:54.237754 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
Feb 01 08:28:54 crc kubenswrapper[5127]: E0201 08:28:54.238432 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:29:05 crc kubenswrapper[5127]: I0201 08:29:05.236741 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
Feb 01 08:29:05 crc kubenswrapper[5127]: E0201 08:29:05.237969 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:29:20 crc kubenswrapper[5127]: I0201 08:29:20.247380 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
Feb 01 08:29:20 crc kubenswrapper[5127]: E0201 08:29:20.248647 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:29:32 crc kubenswrapper[5127]: I0201 08:29:32.236176 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
Feb 01 08:29:32 crc kubenswrapper[5127]: E0201 08:29:32.237194 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:29:42 crc kubenswrapper[5127]: I0201 08:29:42.044954 5127 scope.go:117] "RemoveContainer" containerID="a4f6dd18fed9b260bde2d3f024a9919f17a73a359c70b5a50e01f5245bab08ca"
Feb 01 08:29:46 crc kubenswrapper[5127]: I0201 08:29:46.235931 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
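
From 08:27:52 onward the same two entries repeat for machine-config-daemon-s2frk every 10-15 seconds: each pod sync re-evaluates the failed container, and pod_workers skips the restart because the container is still inside its CrashLoopBackOff window, which the message pins at the 5m0s cap. The wait behind that message grows exponentially with each failed restart until it reaches the cap. A sketch of that capped doubling; the 10s base is an assumption (the kubelet's usual default), while the 5m cap is taken from the message itself:

    package main

    import (
        "fmt"
        "time"
    )

    func backoff(restarts int) time.Duration {
        const base = 10 * time.Second
        const limit = 5 * time.Minute
        d := base
        for i := 0; i < restarts && d < limit; i++ {
            d *= 2
        }
        if d > limit {
            d = limit
        }
        return d
    }

    func main() {
        for r := 0; r <= 6; r++ {
            fmt.Printf("restart %d -> wait %v\n", r, backoff(r))
        }
        // 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s: once at the cap, every sync
        // inside the window is logged and skipped, as in the entries above.
    }
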
Feb 01 08:29:46 crc kubenswrapper[5127]: E0201 08:29:46.236909 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:29:57 crc kubenswrapper[5127]: I0201 08:29:57.236415 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037"
Feb 01 08:29:57 crc kubenswrapper[5127]: E0201 08:29:57.237634 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.146983 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k"]
Feb 01 08:30:00 crc kubenswrapper[5127]: E0201 08:30:00.147548 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1fd601-c067-479f-b590-fceaf3ecebaa" containerName="mariadb-client"
Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.147561 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1fd601-c067-479f-b590-fceaf3ecebaa" containerName="mariadb-client"
Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.147724 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1fd601-c067-479f-b590-fceaf3ecebaa" containerName="mariadb-client"
Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.148182 5127 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.150823 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.150859 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.156232 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2gc\" (UniqueName: \"kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.156296 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.156657 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.164212 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k"] Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.258242 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2gc\" (UniqueName: \"kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.258332 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.258490 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.259464 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume\") pod 
\"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.266593 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.278193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2gc\" (UniqueName: \"kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc\") pod \"collect-profiles-29498910-fz78k\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.474462 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:00 crc kubenswrapper[5127]: I0201 08:30:00.901607 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k"] Feb 01 08:30:01 crc kubenswrapper[5127]: I0201 08:30:01.837146 5127 generic.go:334] "Generic (PLEG): container finished" podID="afc22c56-417d-47ce-92f5-72f5b9e3d2fb" containerID="d5b8075211897ba9c810a285fde9d758dcf29f5e819abd0685e33ddb0e4a6bd0" exitCode=0 Feb 01 08:30:01 crc kubenswrapper[5127]: I0201 08:30:01.837202 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" event={"ID":"afc22c56-417d-47ce-92f5-72f5b9e3d2fb","Type":"ContainerDied","Data":"d5b8075211897ba9c810a285fde9d758dcf29f5e819abd0685e33ddb0e4a6bd0"} Feb 01 08:30:01 crc kubenswrapper[5127]: I0201 08:30:01.837537 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" event={"ID":"afc22c56-417d-47ce-92f5-72f5b9e3d2fb","Type":"ContainerStarted","Data":"ab6830da2f28f4b8ec37bfcd256764238e9adf1db1d5eff817a753d05ed7f969"} Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.146466 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.305471 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg2gc\" (UniqueName: \"kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc\") pod \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.305562 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume\") pod \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.305643 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume\") pod \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\" (UID: \"afc22c56-417d-47ce-92f5-72f5b9e3d2fb\") " Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.306644 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume" (OuterVolumeSpecName: "config-volume") pod "afc22c56-417d-47ce-92f5-72f5b9e3d2fb" (UID: "afc22c56-417d-47ce-92f5-72f5b9e3d2fb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.315454 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "afc22c56-417d-47ce-92f5-72f5b9e3d2fb" (UID: "afc22c56-417d-47ce-92f5-72f5b9e3d2fb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.315464 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc" (OuterVolumeSpecName: "kube-api-access-mg2gc") pod "afc22c56-417d-47ce-92f5-72f5b9e3d2fb" (UID: "afc22c56-417d-47ce-92f5-72f5b9e3d2fb"). InnerVolumeSpecName "kube-api-access-mg2gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.407365 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg2gc\" (UniqueName: \"kubernetes.io/projected/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-kube-api-access-mg2gc\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.407399 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.407409 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/afc22c56-417d-47ce-92f5-72f5b9e3d2fb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.860466 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" event={"ID":"afc22c56-417d-47ce-92f5-72f5b9e3d2fb","Type":"ContainerDied","Data":"ab6830da2f28f4b8ec37bfcd256764238e9adf1db1d5eff817a753d05ed7f969"} Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.860509 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab6830da2f28f4b8ec37bfcd256764238e9adf1db1d5eff817a753d05ed7f969" Feb 01 08:30:03 crc kubenswrapper[5127]: I0201 08:30:03.860646 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k" Feb 01 08:30:04 crc kubenswrapper[5127]: I0201 08:30:04.218810 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn"] Feb 01 08:30:04 crc kubenswrapper[5127]: I0201 08:30:04.226247 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498865-4pksn"] Feb 01 08:30:04 crc kubenswrapper[5127]: I0201 08:30:04.246062 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8335c315-d03b-495f-99c6-26d8bf68938a" path="/var/lib/kubelet/pods/8335c315-d03b-495f-99c6-26d8bf68938a/volumes" Feb 01 08:30:08 crc kubenswrapper[5127]: I0201 08:30:08.236044 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:30:08 crc kubenswrapper[5127]: E0201 08:30:08.236465 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:30:23 crc kubenswrapper[5127]: I0201 08:30:23.236013 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:30:23 crc kubenswrapper[5127]: E0201 08:30:23.236916 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:30:38 crc kubenswrapper[5127]: I0201 08:30:38.236441 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:30:38 crc kubenswrapper[5127]: E0201 08:30:38.237313 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:30:42 crc kubenswrapper[5127]: I0201 08:30:42.118737 5127 scope.go:117] "RemoveContainer" containerID="e1c6cba150a023d7c39a6c73b619f811ea3c6900f8f91ebe0fe28f34e682b4dc" Feb 01 08:30:49 crc kubenswrapper[5127]: I0201 08:30:49.236245 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:30:49 crc kubenswrapper[5127]: E0201 08:30:49.237554 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.788033 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:30:54 crc kubenswrapper[5127]: E0201 08:30:54.789432 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc22c56-417d-47ce-92f5-72f5b9e3d2fb" containerName="collect-profiles" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.789483 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc22c56-417d-47ce-92f5-72f5b9e3d2fb" containerName="collect-profiles" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.789981 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc22c56-417d-47ce-92f5-72f5b9e3d2fb" containerName="collect-profiles" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.792495 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.804390 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.853851 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.854028 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mkmq\" (UniqueName: \"kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.854124 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.955429 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.955517 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mkmq\" (UniqueName: \"kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.955577 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.956111 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.956169 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:54 crc kubenswrapper[5127]: I0201 08:30:54.976396 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5mkmq\" (UniqueName: \"kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq\") pod \"redhat-marketplace-h88hs\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:55 crc kubenswrapper[5127]: I0201 08:30:55.128871 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:30:55 crc kubenswrapper[5127]: I0201 08:30:55.580410 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:30:56 crc kubenswrapper[5127]: I0201 08:30:56.314083 5127 generic.go:334] "Generic (PLEG): container finished" podID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerID="0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753" exitCode=0 Feb 01 08:30:56 crc kubenswrapper[5127]: I0201 08:30:56.314166 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerDied","Data":"0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753"} Feb 01 08:30:56 crc kubenswrapper[5127]: I0201 08:30:56.314452 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerStarted","Data":"8424344157b7c6e18076a4b4b2b53331b4d06640db52d19e5486fdd23bba81aa"} Feb 01 08:30:56 crc kubenswrapper[5127]: I0201 08:30:56.317154 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:30:57 crc kubenswrapper[5127]: I0201 08:30:57.321543 5127 generic.go:334] "Generic (PLEG): container finished" podID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerID="0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394" exitCode=0 Feb 01 08:30:57 crc kubenswrapper[5127]: I0201 08:30:57.321754 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerDied","Data":"0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394"} Feb 01 08:30:59 crc kubenswrapper[5127]: I0201 08:30:59.338139 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerStarted","Data":"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724"} Feb 01 08:30:59 crc kubenswrapper[5127]: I0201 08:30:59.362810 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h88hs" podStartSLOduration=3.498902443 podStartE2EDuration="5.362792391s" podCreationTimestamp="2026-02-01 08:30:54 +0000 UTC" firstStartedPulling="2026-02-01 08:30:56.316759348 +0000 UTC m=+6206.802661751" lastFinishedPulling="2026-02-01 08:30:58.180649336 +0000 UTC m=+6208.666551699" observedRunningTime="2026-02-01 08:30:59.354185829 +0000 UTC m=+6209.840088212" watchObservedRunningTime="2026-02-01 08:30:59.362792391 +0000 UTC m=+6209.848694754" Feb 01 08:31:00 crc kubenswrapper[5127]: I0201 08:31:00.240094 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:31:00 crc kubenswrapper[5127]: E0201 08:31:00.240816 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:31:05 crc kubenswrapper[5127]: I0201 08:31:05.129709 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:05 crc kubenswrapper[5127]: I0201 08:31:05.130446 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:05 crc kubenswrapper[5127]: I0201 08:31:05.181258 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:05 crc kubenswrapper[5127]: I0201 08:31:05.475317 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:05 crc kubenswrapper[5127]: I0201 08:31:05.555955 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.425119 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h88hs" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="registry-server" containerID="cri-o://ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724" gracePeriod=2 Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.882893 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.990117 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities\") pod \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.990245 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content\") pod \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.990312 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mkmq\" (UniqueName: \"kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq\") pod \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\" (UID: \"8658fe26-a6e4-4f24-a73a-1b53aef2ab32\") " Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.992683 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities" (OuterVolumeSpecName: "utilities") pod "8658fe26-a6e4-4f24-a73a-1b53aef2ab32" (UID: "8658fe26-a6e4-4f24-a73a-1b53aef2ab32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:07 crc kubenswrapper[5127]: I0201 08:31:07.997414 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq" (OuterVolumeSpecName: "kube-api-access-5mkmq") pod "8658fe26-a6e4-4f24-a73a-1b53aef2ab32" (UID: "8658fe26-a6e4-4f24-a73a-1b53aef2ab32"). InnerVolumeSpecName "kube-api-access-5mkmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.026578 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8658fe26-a6e4-4f24-a73a-1b53aef2ab32" (UID: "8658fe26-a6e4-4f24-a73a-1b53aef2ab32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.092269 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.092312 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.092327 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mkmq\" (UniqueName: \"kubernetes.io/projected/8658fe26-a6e4-4f24-a73a-1b53aef2ab32-kube-api-access-5mkmq\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:08 crc kubenswrapper[5127]: E0201 08:31:08.346175 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8658fe26_a6e4_4f24_a73a_1b53aef2ab32.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.437921 5127 generic.go:334] "Generic (PLEG): container finished" podID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerID="ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724" exitCode=0 Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.437960 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerDied","Data":"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724"} Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.437988 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h88hs" event={"ID":"8658fe26-a6e4-4f24-a73a-1b53aef2ab32","Type":"ContainerDied","Data":"8424344157b7c6e18076a4b4b2b53331b4d06640db52d19e5486fdd23bba81aa"} Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.438006 5127 scope.go:117] "RemoveContainer" containerID="ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.438046 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h88hs" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.476367 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.480770 5127 scope.go:117] "RemoveContainer" containerID="0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.487273 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h88hs"] Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.508722 5127 scope.go:117] "RemoveContainer" containerID="0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.537328 5127 scope.go:117] "RemoveContainer" containerID="ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724" Feb 01 08:31:08 crc kubenswrapper[5127]: E0201 08:31:08.537743 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724\": container with ID starting with ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724 not found: ID does not exist" containerID="ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.537803 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724"} err="failed to get container status \"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724\": rpc error: code = NotFound desc = could not find container \"ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724\": container with ID starting with ee584487767fec3846beef9d6dccb9ddb93c97e9d263bdc53e8946e8a61d6724 not found: ID does not exist" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.537852 5127 scope.go:117] "RemoveContainer" containerID="0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394" Feb 01 08:31:08 crc kubenswrapper[5127]: E0201 08:31:08.538488 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394\": container with ID starting with 0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394 not found: ID does not exist" containerID="0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.538526 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394"} err="failed to get container status \"0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394\": rpc error: code = NotFound desc = could not find container \"0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394\": container with ID starting with 0a5e2671572de198b57cb5417f83e97403c040eced0373673819bf765be5a394 not found: ID does not exist" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.538550 5127 scope.go:117] "RemoveContainer" containerID="0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753" Feb 01 08:31:08 crc kubenswrapper[5127]: E0201 08:31:08.539113 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753\": container with ID starting with 0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753 not found: ID does not exist" containerID="0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753" Feb 01 08:31:08 crc kubenswrapper[5127]: I0201 08:31:08.539135 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753"} err="failed to get container status \"0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753\": rpc error: code = NotFound desc = could not find container \"0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753\": container with ID starting with 0a96eaea86bf4ebb822002d5200566cc07ef238824c154cc3f4c872156373753 not found: ID does not exist" Feb 01 08:31:10 crc kubenswrapper[5127]: I0201 08:31:10.247327 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" path="/var/lib/kubelet/pods/8658fe26-a6e4-4f24-a73a-1b53aef2ab32/volumes" Feb 01 08:31:13 crc kubenswrapper[5127]: I0201 08:31:13.239480 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:31:13 crc kubenswrapper[5127]: E0201 08:31:13.240296 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:31:25 crc kubenswrapper[5127]: I0201 08:31:25.235392 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:31:25 crc kubenswrapper[5127]: E0201 08:31:25.236179 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.514857 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:32 crc kubenswrapper[5127]: E0201 08:31:32.515861 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="extract-utilities" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.515882 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="extract-utilities" Feb 01 08:31:32 crc kubenswrapper[5127]: E0201 08:31:32.515929 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="extract-content" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.515943 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="extract-content" Feb 01 08:31:32 crc kubenswrapper[5127]: E0201 
08:31:32.515959 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="registry-server" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.515967 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="registry-server" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.516150 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8658fe26-a6e4-4f24-a73a-1b53aef2ab32" containerName="registry-server" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.517713 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.540047 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.608444 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.608505 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.608727 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb7ct\" (UniqueName: \"kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.709975 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb7ct\" (UniqueName: \"kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.710080 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.710117 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.710612 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.710936 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.734784 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb7ct\" (UniqueName: \"kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct\") pod \"certified-operators-d9mpt\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:32 crc kubenswrapper[5127]: I0201 08:31:32.840550 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:33 crc kubenswrapper[5127]: I0201 08:31:33.319701 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:33 crc kubenswrapper[5127]: I0201 08:31:33.647765 5127 generic.go:334] "Generic (PLEG): container finished" podID="89a46c33-a896-41f2-b479-3e864a204722" containerID="aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4" exitCode=0 Feb 01 08:31:33 crc kubenswrapper[5127]: I0201 08:31:33.647816 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerDied","Data":"aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4"} Feb 01 08:31:33 crc kubenswrapper[5127]: I0201 08:31:33.647851 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerStarted","Data":"c89400cbfa3064536fdb255ba25a83408e173f9fa2e4509b4047d7fb6c08cfbe"} Feb 01 08:31:34 crc kubenswrapper[5127]: I0201 08:31:34.658695 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerStarted","Data":"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87"} Feb 01 08:31:35 crc kubenswrapper[5127]: I0201 08:31:35.667893 5127 generic.go:334] "Generic (PLEG): container finished" podID="89a46c33-a896-41f2-b479-3e864a204722" containerID="da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87" exitCode=0 Feb 01 08:31:35 crc kubenswrapper[5127]: I0201 08:31:35.667961 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerDied","Data":"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87"} Feb 01 08:31:36 crc kubenswrapper[5127]: I0201 08:31:36.675280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerStarted","Data":"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e"} Feb 01 08:31:36 crc kubenswrapper[5127]: I0201 08:31:36.712641 
5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d9mpt" podStartSLOduration=2.284245613 podStartE2EDuration="4.712620367s" podCreationTimestamp="2026-02-01 08:31:32 +0000 UTC" firstStartedPulling="2026-02-01 08:31:33.649766623 +0000 UTC m=+6244.135668986" lastFinishedPulling="2026-02-01 08:31:36.078141367 +0000 UTC m=+6246.564043740" observedRunningTime="2026-02-01 08:31:36.707991013 +0000 UTC m=+6247.193893376" watchObservedRunningTime="2026-02-01 08:31:36.712620367 +0000 UTC m=+6247.198522740" Feb 01 08:31:38 crc kubenswrapper[5127]: I0201 08:31:38.235904 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:31:38 crc kubenswrapper[5127]: E0201 08:31:38.236392 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:31:42 crc kubenswrapper[5127]: I0201 08:31:42.840727 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:42 crc kubenswrapper[5127]: I0201 08:31:42.841214 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:42 crc kubenswrapper[5127]: I0201 08:31:42.900372 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:43 crc kubenswrapper[5127]: I0201 08:31:43.784277 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:43 crc kubenswrapper[5127]: I0201 08:31:43.836420 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:45 crc kubenswrapper[5127]: I0201 08:31:45.753643 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d9mpt" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="registry-server" containerID="cri-o://56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e" gracePeriod=2 Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.155293 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.233878 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities\") pod \"89a46c33-a896-41f2-b479-3e864a204722\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.233967 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb7ct\" (UniqueName: \"kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct\") pod \"89a46c33-a896-41f2-b479-3e864a204722\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.234073 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content\") pod \"89a46c33-a896-41f2-b479-3e864a204722\" (UID: \"89a46c33-a896-41f2-b479-3e864a204722\") " Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.234828 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities" (OuterVolumeSpecName: "utilities") pod "89a46c33-a896-41f2-b479-3e864a204722" (UID: "89a46c33-a896-41f2-b479-3e864a204722"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.242043 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct" (OuterVolumeSpecName: "kube-api-access-zb7ct") pod "89a46c33-a896-41f2-b479-3e864a204722" (UID: "89a46c33-a896-41f2-b479-3e864a204722"). InnerVolumeSpecName "kube-api-access-zb7ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.242973 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.243144 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb7ct\" (UniqueName: \"kubernetes.io/projected/89a46c33-a896-41f2-b479-3e864a204722-kube-api-access-zb7ct\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.300998 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89a46c33-a896-41f2-b479-3e864a204722" (UID: "89a46c33-a896-41f2-b479-3e864a204722"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.345530 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89a46c33-a896-41f2-b479-3e864a204722-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.767751 5127 generic.go:334] "Generic (PLEG): container finished" podID="89a46c33-a896-41f2-b479-3e864a204722" containerID="56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e" exitCode=0 Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.767831 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerDied","Data":"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e"} Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.767886 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d9mpt" event={"ID":"89a46c33-a896-41f2-b479-3e864a204722","Type":"ContainerDied","Data":"c89400cbfa3064536fdb255ba25a83408e173f9fa2e4509b4047d7fb6c08cfbe"} Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.767928 5127 scope.go:117] "RemoveContainer" containerID="56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.767943 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d9mpt" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.804458 5127 scope.go:117] "RemoveContainer" containerID="da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.834309 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.839901 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d9mpt"] Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.845163 5127 scope.go:117] "RemoveContainer" containerID="aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.887489 5127 scope.go:117] "RemoveContainer" containerID="56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e" Feb 01 08:31:46 crc kubenswrapper[5127]: E0201 08:31:46.888052 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e\": container with ID starting with 56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e not found: ID does not exist" containerID="56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.888101 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e"} err="failed to get container status \"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e\": rpc error: code = NotFound desc = could not find container \"56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e\": container with ID starting with 56b80b9fdfb05b9897b01d4bbb1a97328f480a5893157dbb2734465231b5af8e not found: ID does not exist" Feb 01 
08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.888138 5127 scope.go:117] "RemoveContainer" containerID="da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87" Feb 01 08:31:46 crc kubenswrapper[5127]: E0201 08:31:46.888809 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87\": container with ID starting with da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87 not found: ID does not exist" containerID="da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.888878 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87"} err="failed to get container status \"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87\": rpc error: code = NotFound desc = could not find container \"da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87\": container with ID starting with da71aa9c0cd8b966df524813a663775a7c02b2f8ec42f703ff87e3aeb139eb87 not found: ID does not exist" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.888916 5127 scope.go:117] "RemoveContainer" containerID="aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4" Feb 01 08:31:46 crc kubenswrapper[5127]: E0201 08:31:46.889277 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4\": container with ID starting with aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4 not found: ID does not exist" containerID="aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4" Feb 01 08:31:46 crc kubenswrapper[5127]: I0201 08:31:46.889317 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4"} err="failed to get container status \"aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4\": rpc error: code = NotFound desc = could not find container \"aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4\": container with ID starting with aa8d08b81ff082fa15de224e1d86133065ba7a723b81bc66d0068b5a3076e5f4 not found: ID does not exist" Feb 01 08:31:48 crc kubenswrapper[5127]: I0201 08:31:48.252614 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a46c33-a896-41f2-b479-3e864a204722" path="/var/lib/kubelet/pods/89a46c33-a896-41f2-b479-3e864a204722/volumes" Feb 01 08:31:49 crc kubenswrapper[5127]: I0201 08:31:49.236004 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:31:49 crc kubenswrapper[5127]: E0201 08:31:49.236897 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:32:02 crc kubenswrapper[5127]: I0201 08:32:02.236570 5127 scope.go:117] "RemoveContainer" 
containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:32:02 crc kubenswrapper[5127]: E0201 08:32:02.238303 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:32:14 crc kubenswrapper[5127]: I0201 08:32:14.235814 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:32:14 crc kubenswrapper[5127]: E0201 08:32:14.237046 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:32:26 crc kubenswrapper[5127]: I0201 08:32:26.236104 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:32:26 crc kubenswrapper[5127]: E0201 08:32:26.237146 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:32:40 crc kubenswrapper[5127]: I0201 08:32:40.248879 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:32:41 crc kubenswrapper[5127]: I0201 08:32:41.317448 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592"} Feb 01 08:32:42 crc kubenswrapper[5127]: I0201 08:32:42.218781 5127 scope.go:117] "RemoveContainer" containerID="40f8c3eaeef4f01d60f6e71ca05593645fde76f964d2077b56989684c7973158" Feb 01 08:32:42 crc kubenswrapper[5127]: I0201 08:32:42.256017 5127 scope.go:117] "RemoveContainer" containerID="6bb87abcdea6b045b58aa2a6d2b524b65577a9c446772269e91c56253472d38e" Feb 01 08:32:42 crc kubenswrapper[5127]: I0201 08:32:42.287033 5127 scope.go:117] "RemoveContainer" containerID="04c22418fcdc452fdd928cd2113c1c1d57e83520c4db833b7a172d004af3f2f5" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.471613 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:30 crc kubenswrapper[5127]: E0201 08:33:30.472757 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="extract-utilities" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.472780 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="extract-utilities" Feb 01 08:33:30 
crc kubenswrapper[5127]: E0201 08:33:30.472824 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="registry-server" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.472835 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="registry-server" Feb 01 08:33:30 crc kubenswrapper[5127]: E0201 08:33:30.472851 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="extract-content" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.472862 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="extract-content" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.473058 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="89a46c33-a896-41f2-b479-3e864a204722" containerName="registry-server" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.474451 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.484273 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.550321 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.550428 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.550565 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsd5l\" (UniqueName: \"kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.651815 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.651892 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.651926 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsd5l\" (UniqueName: 
\"kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.652512 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.652494 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.673085 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsd5l\" (UniqueName: \"kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l\") pod \"redhat-operators-xvdrg\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:30 crc kubenswrapper[5127]: I0201 08:33:30.812361 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:31 crc kubenswrapper[5127]: I0201 08:33:31.255141 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:31 crc kubenswrapper[5127]: I0201 08:33:31.808463 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed72001c-10a1-40ed-a379-bc416583cc76" containerID="c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed" exitCode=0 Feb 01 08:33:31 crc kubenswrapper[5127]: I0201 08:33:31.808554 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerDied","Data":"c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed"} Feb 01 08:33:31 crc kubenswrapper[5127]: I0201 08:33:31.809013 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerStarted","Data":"85ee8b336e512bf82aafed302f03d5e414eaf88c8bc1605d3825815d4908f577"} Feb 01 08:33:32 crc kubenswrapper[5127]: I0201 08:33:32.818415 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerStarted","Data":"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20"} Feb 01 08:33:33 crc kubenswrapper[5127]: I0201 08:33:33.828349 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed72001c-10a1-40ed-a379-bc416583cc76" containerID="5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20" exitCode=0 Feb 01 08:33:33 crc kubenswrapper[5127]: I0201 08:33:33.828517 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerDied","Data":"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20"} Feb 01 08:33:34 crc 
kubenswrapper[5127]: I0201 08:33:34.840144 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerStarted","Data":"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e"} Feb 01 08:33:34 crc kubenswrapper[5127]: I0201 08:33:34.875359 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xvdrg" podStartSLOduration=2.464834079 podStartE2EDuration="4.875329222s" podCreationTimestamp="2026-02-01 08:33:30 +0000 UTC" firstStartedPulling="2026-02-01 08:33:31.810758671 +0000 UTC m=+6362.296661034" lastFinishedPulling="2026-02-01 08:33:34.221253774 +0000 UTC m=+6364.707156177" observedRunningTime="2026-02-01 08:33:34.866244547 +0000 UTC m=+6365.352146980" watchObservedRunningTime="2026-02-01 08:33:34.875329222 +0000 UTC m=+6365.361231625" Feb 01 08:33:40 crc kubenswrapper[5127]: I0201 08:33:40.813022 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:40 crc kubenswrapper[5127]: I0201 08:33:40.813524 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:41 crc kubenswrapper[5127]: I0201 08:33:41.889341 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xvdrg" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="registry-server" probeResult="failure" output=< Feb 01 08:33:41 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:33:41 crc kubenswrapper[5127]: > Feb 01 08:33:50 crc kubenswrapper[5127]: I0201 08:33:50.881828 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:50 crc kubenswrapper[5127]: I0201 08:33:50.958961 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:51 crc kubenswrapper[5127]: I0201 08:33:51.121863 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:51 crc kubenswrapper[5127]: I0201 08:33:51.999325 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xvdrg" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="registry-server" containerID="cri-o://e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e" gracePeriod=2 Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.558059 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.631331 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities\") pod \"ed72001c-10a1-40ed-a379-bc416583cc76\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.631488 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsd5l\" (UniqueName: \"kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l\") pod \"ed72001c-10a1-40ed-a379-bc416583cc76\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.631632 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content\") pod \"ed72001c-10a1-40ed-a379-bc416583cc76\" (UID: \"ed72001c-10a1-40ed-a379-bc416583cc76\") " Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.634565 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities" (OuterVolumeSpecName: "utilities") pod "ed72001c-10a1-40ed-a379-bc416583cc76" (UID: "ed72001c-10a1-40ed-a379-bc416583cc76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.643160 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l" (OuterVolumeSpecName: "kube-api-access-fsd5l") pod "ed72001c-10a1-40ed-a379-bc416583cc76" (UID: "ed72001c-10a1-40ed-a379-bc416583cc76"). InnerVolumeSpecName "kube-api-access-fsd5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.733403 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.733440 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsd5l\" (UniqueName: \"kubernetes.io/projected/ed72001c-10a1-40ed-a379-bc416583cc76-kube-api-access-fsd5l\") on node \"crc\" DevicePath \"\"" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.761151 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed72001c-10a1-40ed-a379-bc416583cc76" (UID: "ed72001c-10a1-40ed-a379-bc416583cc76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:33:52 crc kubenswrapper[5127]: I0201 08:33:52.835135 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed72001c-10a1-40ed-a379-bc416583cc76-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.011087 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed72001c-10a1-40ed-a379-bc416583cc76" containerID="e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e" exitCode=0 Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.011147 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xvdrg" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.011152 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerDied","Data":"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e"} Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.012068 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xvdrg" event={"ID":"ed72001c-10a1-40ed-a379-bc416583cc76","Type":"ContainerDied","Data":"85ee8b336e512bf82aafed302f03d5e414eaf88c8bc1605d3825815d4908f577"} Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.012095 5127 scope.go:117] "RemoveContainer" containerID="e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.045777 5127 scope.go:117] "RemoveContainer" containerID="5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.051261 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.057531 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xvdrg"] Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.070364 5127 scope.go:117] "RemoveContainer" containerID="c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.103326 5127 scope.go:117] "RemoveContainer" containerID="e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e" Feb 01 08:33:53 crc kubenswrapper[5127]: E0201 08:33:53.103896 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e\": container with ID starting with e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e not found: ID does not exist" containerID="e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.103949 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e"} err="failed to get container status \"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e\": rpc error: code = NotFound desc = could not find container \"e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e\": container with ID starting with e272ecbf77887bb406dd3b7090db02e2ae79b2293ef3d6a149501f212496110e not found: ID does not exist" Feb 01 08:33:53 crc 
kubenswrapper[5127]: I0201 08:33:53.103973 5127 scope.go:117] "RemoveContainer" containerID="5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20" Feb 01 08:33:53 crc kubenswrapper[5127]: E0201 08:33:53.104464 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20\": container with ID starting with 5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20 not found: ID does not exist" containerID="5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.104496 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20"} err="failed to get container status \"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20\": rpc error: code = NotFound desc = could not find container \"5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20\": container with ID starting with 5f4d6491c9b1ac62204ece4af8d445d90458e2e550f64d08ee524bd21ef6cf20 not found: ID does not exist" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.104517 5127 scope.go:117] "RemoveContainer" containerID="c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed" Feb 01 08:33:53 crc kubenswrapper[5127]: E0201 08:33:53.104861 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed\": container with ID starting with c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed not found: ID does not exist" containerID="c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed" Feb 01 08:33:53 crc kubenswrapper[5127]: I0201 08:33:53.104926 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed"} err="failed to get container status \"c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed\": rpc error: code = NotFound desc = could not find container \"c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed\": container with ID starting with c561dd09a62835d1a2c72494dfc89321af83ea802fc54ed6ce5d31d57f6880ed not found: ID does not exist" Feb 01 08:33:54 crc kubenswrapper[5127]: I0201 08:33:54.244894 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" path="/var/lib/kubelet/pods/ed72001c-10a1-40ed-a379-bc416583cc76/volumes" Feb 01 08:35:06 crc kubenswrapper[5127]: I0201 08:35:06.741281 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:35:06 crc kubenswrapper[5127]: I0201 08:35:06.742130 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:35:36 crc kubenswrapper[5127]: I0201 08:35:36.741311 5127 patch_prober.go:28] interesting 
pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:35:36 crc kubenswrapper[5127]: I0201 08:35:36.742237 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:36:01 crc kubenswrapper[5127]: I0201 08:36:01.068465 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-87k97"] Feb 01 08:36:01 crc kubenswrapper[5127]: I0201 08:36:01.076981 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-87k97"] Feb 01 08:36:02 crc kubenswrapper[5127]: I0201 08:36:02.243779 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fffed9b-6bbc-4acc-b2a2-222991b5b813" path="/var/lib/kubelet/pods/5fffed9b-6bbc-4acc-b2a2-222991b5b813/volumes" Feb 01 08:36:06 crc kubenswrapper[5127]: I0201 08:36:06.741329 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:36:06 crc kubenswrapper[5127]: I0201 08:36:06.742503 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:36:06 crc kubenswrapper[5127]: I0201 08:36:06.742626 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:36:06 crc kubenswrapper[5127]: I0201 08:36:06.743531 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:36:06 crc kubenswrapper[5127]: I0201 08:36:06.743665 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592" gracePeriod=600 Feb 01 08:36:07 crc kubenswrapper[5127]: I0201 08:36:07.234965 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592" exitCode=0 Feb 01 08:36:07 crc kubenswrapper[5127]: I0201 08:36:07.235163 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592"} Feb 01 08:36:07 crc kubenswrapper[5127]: I0201 08:36:07.235248 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d"} Feb 01 08:36:07 crc kubenswrapper[5127]: I0201 08:36:07.235266 5127 scope.go:117] "RemoveContainer" containerID="cfaabf246679f8f566a5214dfceda57f9d4ccfe80028a4475c1f25a3314fb037" Feb 01 08:36:42 crc kubenswrapper[5127]: I0201 08:36:42.484448 5127 scope.go:117] "RemoveContainer" containerID="7a4310277a75f2d4aa63e17e2c467bace1f486d0229a6eb32fd1b6dacc8edb97" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.937123 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 08:37:32 crc kubenswrapper[5127]: E0201 08:37:32.938445 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="extract-content" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.938461 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="extract-content" Feb 01 08:37:32 crc kubenswrapper[5127]: E0201 08:37:32.938499 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="extract-utilities" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.938505 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="extract-utilities" Feb 01 08:37:32 crc kubenswrapper[5127]: E0201 08:37:32.938523 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="registry-server" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.938529 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="registry-server" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.938757 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed72001c-10a1-40ed-a379-bc416583cc76" containerName="registry-server" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.939499 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.942254 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hxbhz" Feb 01 08:37:32 crc kubenswrapper[5127]: I0201 08:37:32.950521 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.047629 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhw7\" (UniqueName: \"kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.047962 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.150602 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.150781 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhw7\" (UniqueName: \"kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.156049 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
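The csi_attacher.go:380 entry above shows kubelet skipping the MountDevice step for pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183: the kubevirt.io.hostpath-provisioner driver does not advertise the CSI STAGE_UNSTAGE_VOLUME node capability, and staging (NodeStageVolume/NodeUnstageVolume) is optional in the CSI spec, so kubelet goes straight to publishing the volume (NodePublishVolume). Below is a minimal sketch of the same capability probe against a driver's node socket; the socket path and the standalone program are illustrative assumptions, not kubelet's actual wiring.

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	csi "github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

// nodeSupportsStageUnstage asks a CSI driver's node service whether it
// implements NodeStageVolume/NodeUnstageVolume. Kubelet's CSI attacher
// performs an equivalent check; when the capability is absent it logs
// "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..."
// as seen above, and mounts the volume via NodePublishVolume alone.
func nodeSupportsStageUnstage(ctx context.Context, node csi.NodeClient) (bool, error) {
	resp, err := node.NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		return false, err
	}
	for _, c := range resp.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			return true, nil
		}
	}
	return false, nil
}

func main() {
	// Illustrative socket path only; the actual registration path for the
	// hostpath provisioner on this node may differ.
	conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	ok, err := nodeSupportsStageUnstage(ctx, csi.NewNodeClient(conn))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("STAGE_UNSTAGE_VOLUME supported:", ok)
}

Consistent with that, the "MountVolume.MountDevice succeeded" entry that follows appears to reduce to recording the global mount path under /var/lib/kubelet/plugins/kubernetes.io/csi/.../globalmount, with the mount into the pod happening at MountVolume.SetUp.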
Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.156127 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c26d678a6771e9a2bbb1969493bbee865439257f04600ce4d551b599e3cd519e/globalmount\"" pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.180264 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlhw7\" (UniqueName: \"kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.202503 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") pod \"mariadb-copy-data\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.274681 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 01 08:37:33 crc kubenswrapper[5127]: I0201 08:37:33.825243 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 08:37:34 crc kubenswrapper[5127]: I0201 08:37:34.026011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef","Type":"ContainerStarted","Data":"4bfbc89aa53e8543a4b32fc0aff3189e74ae543445c14b568152d24bfe27a12c"} Feb 01 08:37:34 crc kubenswrapper[5127]: I0201 08:37:34.026069 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef","Type":"ContainerStarted","Data":"3fff16be94bbc51e5a8d1867978d66f0f2c1c2b9f06154bce866bdac8f663ea1"} Feb 01 08:37:34 crc kubenswrapper[5127]: I0201 08:37:34.050057 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.050024924 podStartE2EDuration="3.050024924s" podCreationTimestamp="2026-02-01 08:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:37:34.043377256 +0000 UTC m=+6604.529279619" watchObservedRunningTime="2026-02-01 08:37:34.050024924 +0000 UTC m=+6604.535927317" Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.370630 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.375620 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.376380 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.520899 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zztcw\" (UniqueName: \"kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw\") pod \"mariadb-client\" (UID: \"d2745abe-7224-49df-af07-ffbbe64d2510\") " pod="openstack/mariadb-client" Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.622998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zztcw\" (UniqueName: \"kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw\") pod \"mariadb-client\" (UID: \"d2745abe-7224-49df-af07-ffbbe64d2510\") " pod="openstack/mariadb-client" Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.648250 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zztcw\" (UniqueName: \"kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw\") pod \"mariadb-client\" (UID: \"d2745abe-7224-49df-af07-ffbbe64d2510\") " pod="openstack/mariadb-client" Feb 01 08:37:37 crc kubenswrapper[5127]: I0201 08:37:37.712122 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:38 crc kubenswrapper[5127]: I0201 08:37:38.198257 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:39 crc kubenswrapper[5127]: I0201 08:37:39.072170 5127 generic.go:334] "Generic (PLEG): container finished" podID="d2745abe-7224-49df-af07-ffbbe64d2510" containerID="eae1d109b609b9ec9050a8b2e4c3c17c740c313155b20685463e47b4c9bbe08f" exitCode=0 Feb 01 08:37:39 crc kubenswrapper[5127]: I0201 08:37:39.072255 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2745abe-7224-49df-af07-ffbbe64d2510","Type":"ContainerDied","Data":"eae1d109b609b9ec9050a8b2e4c3c17c740c313155b20685463e47b4c9bbe08f"} Feb 01 08:37:39 crc kubenswrapper[5127]: I0201 08:37:39.072311 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2745abe-7224-49df-af07-ffbbe64d2510","Type":"ContainerStarted","Data":"71ac897462bbad0809ab4218eb264dba4b6f7e8f84c44d438f763385a5da5c4f"} Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.437524 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.465824 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d2745abe-7224-49df-af07-ffbbe64d2510/mariadb-client/0.log" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.492162 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.497358 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.573961 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zztcw\" (UniqueName: \"kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw\") pod \"d2745abe-7224-49df-af07-ffbbe64d2510\" (UID: \"d2745abe-7224-49df-af07-ffbbe64d2510\") " Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.580886 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw" (OuterVolumeSpecName: "kube-api-access-zztcw") pod "d2745abe-7224-49df-af07-ffbbe64d2510" (UID: "d2745abe-7224-49df-af07-ffbbe64d2510"). InnerVolumeSpecName "kube-api-access-zztcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.660128 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:40 crc kubenswrapper[5127]: E0201 08:37:40.661269 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2745abe-7224-49df-af07-ffbbe64d2510" containerName="mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.661313 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2745abe-7224-49df-af07-ffbbe64d2510" containerName="mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.661744 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2745abe-7224-49df-af07-ffbbe64d2510" containerName="mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.662673 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.667302 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.675717 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zztcw\" (UniqueName: \"kubernetes.io/projected/d2745abe-7224-49df-af07-ffbbe64d2510-kube-api-access-zztcw\") on node \"crc\" DevicePath \"\"" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.777651 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mft49\" (UniqueName: \"kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49\") pod \"mariadb-client\" (UID: \"fa1cdc44-3763-4d12-a84b-d183180c9081\") " pod="openstack/mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.880365 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mft49\" (UniqueName: \"kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49\") pod \"mariadb-client\" (UID: \"fa1cdc44-3763-4d12-a84b-d183180c9081\") " pod="openstack/mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.912863 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mft49\" (UniqueName: \"kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49\") pod \"mariadb-client\" (UID: \"fa1cdc44-3763-4d12-a84b-d183180c9081\") " pod="openstack/mariadb-client" Feb 01 08:37:40 crc kubenswrapper[5127]: I0201 08:37:40.989528 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:41 crc kubenswrapper[5127]: I0201 08:37:41.103066 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71ac897462bbad0809ab4218eb264dba4b6f7e8f84c44d438f763385a5da5c4f" Feb 01 08:37:41 crc kubenswrapper[5127]: I0201 08:37:41.103183 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:41 crc kubenswrapper[5127]: I0201 08:37:41.143537 5127 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="d2745abe-7224-49df-af07-ffbbe64d2510" podUID="fa1cdc44-3763-4d12-a84b-d183180c9081" Feb 01 08:37:41 crc kubenswrapper[5127]: I0201 08:37:41.251485 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:41 crc kubenswrapper[5127]: W0201 08:37:41.256207 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa1cdc44_3763_4d12_a84b_d183180c9081.slice/crio-92413c838edad0333298eaf609a873741424675f5d4a4e7c816128d117fba57e WatchSource:0}: Error finding container 92413c838edad0333298eaf609a873741424675f5d4a4e7c816128d117fba57e: Status 404 returned error can't find the container with id 92413c838edad0333298eaf609a873741424675f5d4a4e7c816128d117fba57e Feb 01 08:37:42 crc kubenswrapper[5127]: I0201 08:37:42.116375 5127 generic.go:334] "Generic (PLEG): container finished" podID="fa1cdc44-3763-4d12-a84b-d183180c9081" containerID="69f88a5ce6591eafa4c49d26cc5b2b4edbe4ca8c6c30ecb8a759a81dc7795ffb" exitCode=0 Feb 01 08:37:42 crc kubenswrapper[5127]: I0201 08:37:42.116484 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fa1cdc44-3763-4d12-a84b-d183180c9081","Type":"ContainerDied","Data":"69f88a5ce6591eafa4c49d26cc5b2b4edbe4ca8c6c30ecb8a759a81dc7795ffb"} Feb 01 08:37:42 crc kubenswrapper[5127]: I0201 08:37:42.117063 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"fa1cdc44-3763-4d12-a84b-d183180c9081","Type":"ContainerStarted","Data":"92413c838edad0333298eaf609a873741424675f5d4a4e7c816128d117fba57e"} Feb 01 08:37:42 crc kubenswrapper[5127]: I0201 08:37:42.252782 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2745abe-7224-49df-af07-ffbbe64d2510" path="/var/lib/kubelet/pods/d2745abe-7224-49df-af07-ffbbe64d2510/volumes" Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.474352 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.497374 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_fa1cdc44-3763-4d12-a84b-d183180c9081/mariadb-client/0.log" Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.525456 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.541908 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.624728 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mft49\" (UniqueName: \"kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49\") pod \"fa1cdc44-3763-4d12-a84b-d183180c9081\" (UID: \"fa1cdc44-3763-4d12-a84b-d183180c9081\") " Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.633227 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49" (OuterVolumeSpecName: "kube-api-access-mft49") pod "fa1cdc44-3763-4d12-a84b-d183180c9081" (UID: "fa1cdc44-3763-4d12-a84b-d183180c9081"). 
InnerVolumeSpecName "kube-api-access-mft49". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:37:43 crc kubenswrapper[5127]: I0201 08:37:43.726627 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mft49\" (UniqueName: \"kubernetes.io/projected/fa1cdc44-3763-4d12-a84b-d183180c9081-kube-api-access-mft49\") on node \"crc\" DevicePath \"\"" Feb 01 08:37:44 crc kubenswrapper[5127]: I0201 08:37:44.137986 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92413c838edad0333298eaf609a873741424675f5d4a4e7c816128d117fba57e" Feb 01 08:37:44 crc kubenswrapper[5127]: I0201 08:37:44.138099 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 01 08:37:44 crc kubenswrapper[5127]: I0201 08:37:44.257402 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1cdc44-3763-4d12-a84b-d183180c9081" path="/var/lib/kubelet/pods/fa1cdc44-3763-4d12-a84b-d183180c9081/volumes" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.487451 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 08:38:12 crc kubenswrapper[5127]: E0201 08:38:12.488808 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa1cdc44-3763-4d12-a84b-d183180c9081" containerName="mariadb-client" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.488834 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1cdc44-3763-4d12-a84b-d183180c9081" containerName="mariadb-client" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.489184 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa1cdc44-3763-4d12-a84b-d183180c9081" containerName="mariadb-client" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.491140 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.497307 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.497453 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.500306 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-q2tpb" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.511430 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.514465 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.521571 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.533157 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.535369 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.542287 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.548323 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655470 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655567 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca6150-db6e-4770-88c3-d495682edb2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655677 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655732 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655788 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655900 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxrn\" (UniqueName: \"kubernetes.io/projected/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-kube-api-access-mdxrn\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.655952 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 
08:38:12.656030 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh6v2\" (UniqueName: \"kubernetes.io/projected/f8e5bfb6-7976-4e24-89d4-840cba014b37-kube-api-access-fh6v2\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656102 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8e5bfb6-7976-4e24-89d4-840cba014b37-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656163 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656209 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656253 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656375 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e5bfb6-7976-4e24-89d4-840cba014b37-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656450 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-config\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656518 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89ca6150-db6e-4770-88c3-d495682edb2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656570 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xt9\" (UniqueName: \"kubernetes.io/projected/89ca6150-db6e-4770-88c3-d495682edb2e-kube-api-access-l7xt9\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.656693 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.660539 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.665933 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.669497 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7dzr8" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.669994 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.670366 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.680093 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.686933 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.690954 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.702120 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.704043 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.718306 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.727618 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758536 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758649 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758676 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758708 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxrn\" (UniqueName: \"kubernetes.io/projected/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-kube-api-access-mdxrn\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758731 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758769 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br72p\" (UniqueName: \"kubernetes.io/projected/6846c586-1fd9-443a-9317-33037c64e831-kube-api-access-br72p\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758802 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh6v2\" (UniqueName: \"kubernetes.io/projected/f8e5bfb6-7976-4e24-89d4-840cba014b37-kube-api-access-fh6v2\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758828 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nzp\" (UniqueName: \"kubernetes.io/projected/4775eb36-bd2b-417d-8156-2628ece4a87a-kube-api-access-88nzp\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758853 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4775eb36-bd2b-417d-8156-2628ece4a87a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758890 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8e5bfb6-7976-4e24-89d4-840cba014b37-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758923 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758946 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758966 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.758990 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4775eb36-bd2b-417d-8156-2628ece4a87a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759016 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6846c586-1fd9-443a-9317-33037c64e831-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759068 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e5bfb6-7976-4e24-89d4-840cba014b37-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759103 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-config\") pod \"ovsdbserver-nb-2\" (UID: 
\"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759127 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759158 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89ca6150-db6e-4770-88c3-d495682edb2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759179 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xt9\" (UniqueName: \"kubernetes.io/projected/89ca6150-db6e-4770-88c3-d495682edb2e-kube-api-access-l7xt9\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759200 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-config\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759231 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759268 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759304 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759326 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759348 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 
08:38:12.759370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6846c586-1fd9-443a-9317-33037c64e831-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759393 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca6150-db6e-4770-88c3-d495682edb2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.759421 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.760415 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8e5bfb6-7976-4e24-89d4-840cba014b37-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.760825 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.761211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-config\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.761738 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89ca6150-db6e-4770-88c3-d495682edb2e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.761755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.762156 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.762497 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.762887 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/89ca6150-db6e-4770-88c3-d495682edb2e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8e5bfb6-7976-4e24-89d4-840cba014b37-config\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764330 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764359 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a479d7d239220216bb04a2ef056e2197d36fd9faae85c13e480fdd37d8d9c153/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764386 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764410 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/85f27abcfbb2ce6dc67d53152c9f371ff092e68a26a72b17d73672ac07901400/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764636 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
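
The csi_attacher.go message above explains why no staging call appears for these PVCs: before staging a device, kubelet asks the CSI node plugin for its capabilities, and when STAGE_UNSTAGE_VOLUME is not advertised (as with kubevirt.io.hostpath-provisioner here) MountDevice is recorded as succeeded without calling NodeStageVolume, leaving only the per-pod SetUp (NodePublishVolume) to do real work. A minimal sketch of that capability probe against the CSI gRPC API; the socket path and connection handling are illustrative assumptions, not kubelet's actual wiring:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/container-storage-interface/spec/lib/go/csi"
        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
    )

    // hasStageUnstage reports whether the node plugin advertises the
    // STAGE_UNSTAGE_VOLUME capability; when it does not, kubelet logs
    // "Skipping MountDevice..." and never calls NodeStageVolume.
    func hasStageUnstage(ctx context.Context, node csi.NodeClient) (bool, error) {
        resp, err := node.NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
        if err != nil {
            return false, err
        }
        for _, c := range resp.GetCapabilities() {
            if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        // Placeholder socket path; each driver registers its own endpoint
        // under /var/lib/kubelet/plugins/.
        conn, err := grpc.Dial("unix:///var/lib/kubelet/plugins/example/csi.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        ok, err := hasStageUnstage(context.Background(), csi.NewNodeClient(conn))
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("STAGE_UNSTAGE_VOLUME advertised:", ok)
    }

For the hostpath provisioner the answer is evidently false, which is why every MountDevice in this section completes immediately after the capability check.
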
Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.764724 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d0105735441665e83c3bd9cf6f88f561dd3af584e28ac869f12f693bc2900be/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.767346 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ca6150-db6e-4770-88c3-d495682edb2e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.767374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.768656 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e5bfb6-7976-4e24-89d4-840cba014b37-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.778456 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh6v2\" (UniqueName: \"kubernetes.io/projected/f8e5bfb6-7976-4e24-89d4-840cba014b37-kube-api-access-fh6v2\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.779162 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xt9\" (UniqueName: \"kubernetes.io/projected/89ca6150-db6e-4770-88c3-d495682edb2e-kube-api-access-l7xt9\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.780224 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxrn\" (UniqueName: \"kubernetes.io/projected/ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d-kube-api-access-mdxrn\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.813020 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d0e58fe-9cad-4e18-8ddc-77db78083f06\") pod \"ovsdbserver-nb-1\" (UID: \"89ca6150-db6e-4770-88c3-d495682edb2e\") " pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.815566 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2cd8738f-0d2f-4bce-85e8-3eb158983fed\") pod \"ovsdbserver-nb-2\" (UID: \"f8e5bfb6-7976-4e24-89d4-840cba014b37\") " 
pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.819982 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3e13a7ff-1e84-4e06-8427-f020cb9d683f\") pod \"ovsdbserver-nb-0\" (UID: \"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d\") " pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.823956 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.855760 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860738 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-config\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860776 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860815 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860855 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860876 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860899 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6846c586-1fd9-443a-9317-33037c64e831-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860924 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4j7n\" (UniqueName: 
\"kubernetes.io/projected/fed45656-ff80-4f32-aa27-26223fa85bf5-kube-api-access-s4j7n\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860959 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.860989 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-config\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861013 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed45656-ff80-4f32-aa27-26223fa85bf5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861048 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br72p\" (UniqueName: \"kubernetes.io/projected/6846c586-1fd9-443a-9317-33037c64e831-kube-api-access-br72p\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861079 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nzp\" (UniqueName: \"kubernetes.io/projected/4775eb36-bd2b-417d-8156-2628ece4a87a-kube-api-access-88nzp\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4775eb36-bd2b-417d-8156-2628ece4a87a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861148 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4775eb36-bd2b-417d-8156-2628ece4a87a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861175 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861209 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6846c586-1fd9-443a-9317-33037c64e831-combined-ca-bundle\") pod 
\"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861251 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed45656-ff80-4f32-aa27-26223fa85bf5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861397 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6846c586-1fd9-443a-9317-33037c64e831-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861775 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4775eb36-bd2b-417d-8156-2628ece4a87a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.861885 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.862238 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4775eb36-bd2b-417d-8156-2628ece4a87a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.862818 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.863875 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.864008 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2c3ba254aaf19c846c67eb1bb627f3d142dc88e897f42862bf8bb49db277a268/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.864068 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
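
Every volume in this section cycles through the same three messages: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), operationExecutor.MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). They come from kubelet's volume reconciler, which repeatedly diffs the desired state of world (volumes required by pods scheduled on this node) against the actual state (what is already attached and mounted) and issues mount or unmount operations for the difference. A toy model of that diff, with simplified types standing in for kubelet's internal caches:

    package main

    import "fmt"

    // Toy stand-ins for kubelet's desiredStateOfWorld and actualStateOfWorld
    // caches; the real ones also track attach state, devices, and contexts.
    type volume struct{ pod, name string }

    func reconcile(desired, actual map[volume]bool) {
        // Mount whatever is desired but not yet in the actual state.
        for v := range desired {
            if !actual[v] {
                fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
                actual[v] = true // stands in for the async operation executor
            }
        }
        // Unmount whatever is mounted but no longer desired.
        for v := range actual {
            if !desired[v] {
                fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.pod)
                delete(actual, v)
            }
        }
    }

    func main() {
        desired := map[volume]bool{
            {pod: "ovsdbserver-sb-0", name: "config"}:  true,
            {pod: "ovsdbserver-sb-0", name: "scripts"}: true,
        }
        actual := map[volume]bool{}
        reconcile(desired, actual) // first pass mounts both volumes
        delete(desired, volume{pod: "ovsdbserver-sb-0", name: "scripts"})
        reconcile(desired, actual) // second pass unmounts the removed one
    }

In the real reconciler the mount itself runs asynchronously in an operation executor, which is why the "started" and "SetUp succeeded" messages arrive as separate journal entries.
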
Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.864209 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/46ded214177db263d7dfe7dcb4c92cda0d325ad2e81ad8a8a8d44c5c8decb62e/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.864788 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6846c586-1fd9-443a-9317-33037c64e831-config\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.868339 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6846c586-1fd9-443a-9317-33037c64e831-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.870180 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4775eb36-bd2b-417d-8156-2628ece4a87a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.875555 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.879454 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br72p\" (UniqueName: \"kubernetes.io/projected/6846c586-1fd9-443a-9317-33037c64e831-kube-api-access-br72p\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.883825 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nzp\" (UniqueName: \"kubernetes.io/projected/4775eb36-bd2b-417d-8156-2628ece4a87a-kube-api-access-88nzp\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.897950 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a51de89f-02dd-4f19-bcf4-f024a4c85f03\") pod \"ovsdbserver-sb-0\" (UID: \"4775eb36-bd2b-417d-8156-2628ece4a87a\") " pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.908859 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24dd98d3-672f-4ee3-a60b-63a3ce2cf2ed\") pod \"ovsdbserver-sb-2\" (UID: \"6846c586-1fd9-443a-9317-33037c64e831\") " pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962145 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962200 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed45656-ff80-4f32-aa27-26223fa85bf5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962252 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962285 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4j7n\" (UniqueName: \"kubernetes.io/projected/fed45656-ff80-4f32-aa27-26223fa85bf5-kube-api-access-s4j7n\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962308 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-config\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.962323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed45656-ff80-4f32-aa27-26223fa85bf5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.964752 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.965277 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed45656-ff80-4f32-aa27-26223fa85bf5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.966617 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed45656-ff80-4f32-aa27-26223fa85bf5-config\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.967679 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
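
The device mount path quoted in each MountVolume.MountDevice entry follows kubelet's CSI staging layout, /var/lib/kubelet/plugins/kubernetes.io/csi/<driver>/<sha-256 digest>/globalmount. The digest is derived from the volume's identity; hashing the PV name in the sketch below is an assumption for illustration, since the exact hash input is a kubelet implementation detail:

    package main

    import (
        "crypto/sha256"
        "fmt"
        "path/filepath"
    )

    // stagingPath reproduces the shape of the device mount paths in the log.
    // Hashing the PV name is an illustrative assumption; kubelet derives the
    // digest from the volume's identity internally.
    func stagingPath(driver, volumeID string) string {
        sum := sha256.Sum256([]byte(volumeID))
        return filepath.Join(
            "/var/lib/kubelet/plugins/kubernetes.io/csi",
            driver,
            fmt.Sprintf("%x", sum),
            "globalmount",
        )
    }

    func main() {
        fmt.Println(stagingPath("kubevirt.io.hostpath-provisioner",
            "pvc-4159bb13-2064-4f73-aab9-b137f47f2ede"))
    }
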
Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.967699 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bce05b4e24d7b5f5f1ce21887b29a55a07b26c69bf026dfa003ce15ec9cb85ea/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.972495 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed45656-ff80-4f32-aa27-26223fa85bf5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.986724 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4j7n\" (UniqueName: \"kubernetes.io/projected/fed45656-ff80-4f32-aa27-26223fa85bf5-kube-api-access-s4j7n\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:12 crc kubenswrapper[5127]: I0201 08:38:12.987029 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.018062 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.022611 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4159bb13-2064-4f73-aab9-b137f47f2ede\") pod \"ovsdbserver-sb-1\" (UID: \"fed45656-ff80-4f32-aa27-26223fa85bf5\") " pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.028081 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.383038 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 01 08:38:13 crc kubenswrapper[5127]: W0201 08:38:13.396734 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac5ceef4_919c_4a17_af7e_6ae27bfc5a3d.slice/crio-dbb69d0c149b0f06d75e8703c90e6ab23a01845b705a255d4be2a8cee7ba4293 WatchSource:0}: Error finding container dbb69d0c149b0f06d75e8703c90e6ab23a01845b705a255d4be2a8cee7ba4293: Status 404 returned error can't find the container with id dbb69d0c149b0f06d75e8703c90e6ab23a01845b705a255d4be2a8cee7ba4293 Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.400564 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.471295 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 01 08:38:13 crc kubenswrapper[5127]: W0201 08:38:13.475242 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89ca6150_db6e_4770_88c3_d495682edb2e.slice/crio-01c0592c4b2bcdfe2d5400949377b57a074b0707fd24395d41ea1311078880c2 WatchSource:0}: Error finding container 01c0592c4b2bcdfe2d5400949377b57a074b0707fd24395d41ea1311078880c2: Status 404 returned error can't find the container with id 01c0592c4b2bcdfe2d5400949377b57a074b0707fd24395d41ea1311078880c2 Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.608659 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 01 08:38:13 crc kubenswrapper[5127]: I0201 08:38:13.708234 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 01 08:38:13 crc kubenswrapper[5127]: W0201 08:38:13.710100 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed45656_ff80_4f32_aa27_26223fa85bf5.slice/crio-bb51d4212f9aaad0e9dd94360887ddf4942d670b48b7e994323751120f43003f WatchSource:0}: Error finding container bb51d4212f9aaad0e9dd94360887ddf4942d670b48b7e994323751120f43003f: Status 404 returned error can't find the container with id bb51d4212f9aaad0e9dd94360887ddf4942d670b48b7e994323751120f43003f Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.396871 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fed45656-ff80-4f32-aa27-26223fa85bf5","Type":"ContainerStarted","Data":"bb51d4212f9aaad0e9dd94360887ddf4942d670b48b7e994323751120f43003f"} Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.398887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d","Type":"ContainerStarted","Data":"dbb69d0c149b0f06d75e8703c90e6ab23a01845b705a255d4be2a8cee7ba4293"} Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.400838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"6846c586-1fd9-443a-9317-33037c64e831","Type":"ContainerStarted","Data":"2a0baf20f7558de2b618067e455550998f5f7bebd56c9c68d69c532525398c12"} Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.403256 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"89ca6150-db6e-4770-88c3-d495682edb2e","Type":"ContainerStarted","Data":"01c0592c4b2bcdfe2d5400949377b57a074b0707fd24395d41ea1311078880c2"} Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.509964 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 01 08:38:14 crc kubenswrapper[5127]: W0201 08:38:14.521315 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8e5bfb6_7976_4e24_89d4_840cba014b37.slice/crio-18055cb4b9804cd42a80d48e3651e89eb39a798fbf6e1261f8484dfeb3798bd0 WatchSource:0}: Error finding container 18055cb4b9804cd42a80d48e3651e89eb39a798fbf6e1261f8484dfeb3798bd0: Status 404 returned error can't find the container with id 18055cb4b9804cd42a80d48e3651e89eb39a798fbf6e1261f8484dfeb3798bd0 Feb 01 08:38:14 crc kubenswrapper[5127]: I0201 08:38:14.621517 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 01 08:38:14 crc kubenswrapper[5127]: W0201 08:38:14.632031 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4775eb36_bd2b_417d_8156_2628ece4a87a.slice/crio-ef5815a5a9d14833e229f88d4c83b9b32394af1c88488d49f64e5654ee9ec59c WatchSource:0}: Error finding container ef5815a5a9d14833e229f88d4c83b9b32394af1c88488d49f64e5654ee9ec59c: Status 404 returned error can't find the container with id ef5815a5a9d14833e229f88d4c83b9b32394af1c88488d49f64e5654ee9ec59c Feb 01 08:38:15 crc kubenswrapper[5127]: I0201 08:38:15.412437 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4775eb36-bd2b-417d-8156-2628ece4a87a","Type":"ContainerStarted","Data":"ef5815a5a9d14833e229f88d4c83b9b32394af1c88488d49f64e5654ee9ec59c"} Feb 01 08:38:15 crc kubenswrapper[5127]: I0201 08:38:15.413681 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8e5bfb6-7976-4e24-89d4-840cba014b37","Type":"ContainerStarted","Data":"18055cb4b9804cd42a80d48e3651e89eb39a798fbf6e1261f8484dfeb3798bd0"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.437989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d","Type":"ContainerStarted","Data":"391f09daf2e7b63db434d2bea0704ae625303c1f3a148f30aea0a2672811ff9f"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.438290 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d","Type":"ContainerStarted","Data":"5997f81808970cb8de86f4546f92a8b5b61a9b6f88fa94acc3c7f938b2475ca5"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.441956 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8e5bfb6-7976-4e24-89d4-840cba014b37","Type":"ContainerStarted","Data":"75d3198ee1b4d16106e8aceb5329012eeca1d18c18e5ddf2075534592593aeed"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.441984 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"f8e5bfb6-7976-4e24-89d4-840cba014b37","Type":"ContainerStarted","Data":"c70cc34c369cf40fb9af15fa80828e26b03d5224a45ca02f56d005281d7bffcc"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.444659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"6846c586-1fd9-443a-9317-33037c64e831","Type":"ContainerStarted","Data":"59113dd409b4787d5e911629727a6dafc8dd63b7f46af88d0943665c8c3e4a2b"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.444712 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"6846c586-1fd9-443a-9317-33037c64e831","Type":"ContainerStarted","Data":"8b24895c2d2077e71aac5c9c7f863abbc0edf8cbfb2fc33b2bda500a56046a9c"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.449480 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"89ca6150-db6e-4770-88c3-d495682edb2e","Type":"ContainerStarted","Data":"fad7106bdf11f8902cd529305915abc078130af689e0bf6ea2b27b0539b5514d"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.449514 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"89ca6150-db6e-4770-88c3-d495682edb2e","Type":"ContainerStarted","Data":"18beac6036d914acb9e0ec7b411326dbfeed7aaeb42db6ceec3821fc2aa86802"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.458689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fed45656-ff80-4f32-aa27-26223fa85bf5","Type":"ContainerStarted","Data":"1573db4ac6a5df6ce97b8f7aa7bb77521d253f1658ffb3fcfc6c671c615617da"} Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.467419 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.404682283 podStartE2EDuration="7.467361398s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:13.400377354 +0000 UTC m=+6643.886279717" lastFinishedPulling="2026-02-01 08:38:17.463056469 +0000 UTC m=+6647.948958832" observedRunningTime="2026-02-01 08:38:18.455604122 +0000 UTC m=+6648.941506485" watchObservedRunningTime="2026-02-01 08:38:18.467361398 +0000 UTC m=+6648.953263782" Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.481183 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.51156866 podStartE2EDuration="7.48115942s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:13.478472966 +0000 UTC m=+6643.964375329" lastFinishedPulling="2026-02-01 08:38:17.448063726 +0000 UTC m=+6647.933966089" observedRunningTime="2026-02-01 08:38:18.472643201 +0000 UTC m=+6648.958545584" watchObservedRunningTime="2026-02-01 08:38:18.48115942 +0000 UTC m=+6648.967061793" Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.502231 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.580195281 podStartE2EDuration="7.502209297s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:14.524091827 +0000 UTC m=+6645.009994190" lastFinishedPulling="2026-02-01 08:38:17.446105843 +0000 UTC m=+6647.932008206" observedRunningTime="2026-02-01 08:38:18.493228135 +0000 UTC m=+6648.979130498" watchObservedRunningTime="2026-02-01 08:38:18.502209297 +0000 UTC m=+6648.988111670" Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.824612 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.856284 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 01 
08:38:18 crc kubenswrapper[5127]: I0201 08:38:18.876464 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.018379 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.470270 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4775eb36-bd2b-417d-8156-2628ece4a87a","Type":"ContainerStarted","Data":"42d7396d08c86312e8f1173a24e713a9cdb9fa811d4bd050d91903f415422556"} Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.471235 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4775eb36-bd2b-417d-8156-2628ece4a87a","Type":"ContainerStarted","Data":"2a2fa6bac75c6b206e8c940752c51b75f7432aa9c5e1ebcb119dad3dea010177"} Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.473298 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"fed45656-ff80-4f32-aa27-26223fa85bf5","Type":"ContainerStarted","Data":"1e2a729685923ec0dc1e676a844c386c55930e9e4bfb744f311c28dc4fa16918"} Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.506390 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.691835874 podStartE2EDuration="8.506359442s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:13.612877231 +0000 UTC m=+6644.098779594" lastFinishedPulling="2026-02-01 08:38:17.427400799 +0000 UTC m=+6647.913303162" observedRunningTime="2026-02-01 08:38:18.518245808 +0000 UTC m=+6649.004148191" watchObservedRunningTime="2026-02-01 08:38:19.506359442 +0000 UTC m=+6649.992261845" Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.507529 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.473822309 podStartE2EDuration="8.507517974s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:14.635308469 +0000 UTC m=+6645.121210832" lastFinishedPulling="2026-02-01 08:38:18.669004134 +0000 UTC m=+6649.154906497" observedRunningTime="2026-02-01 08:38:19.495330265 +0000 UTC m=+6649.981232658" watchObservedRunningTime="2026-02-01 08:38:19.507517974 +0000 UTC m=+6649.993420367" Feb 01 08:38:19 crc kubenswrapper[5127]: I0201 08:38:19.521181 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.960327754 podStartE2EDuration="8.521151511s" podCreationTimestamp="2026-02-01 08:38:11 +0000 UTC" firstStartedPulling="2026-02-01 08:38:13.712309586 +0000 UTC m=+6644.198211949" lastFinishedPulling="2026-02-01 08:38:18.273133343 +0000 UTC m=+6648.759035706" observedRunningTime="2026-02-01 08:38:19.516471935 +0000 UTC m=+6650.002374358" watchObservedRunningTime="2026-02-01 08:38:19.521151511 +0000 UTC m=+6650.007053904" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.885124 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.886274 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.932730 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.933221 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.935318 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.935525 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:21 crc kubenswrapper[5127]: I0201 08:38:21.988427 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.029439 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.037881 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.065707 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.066302 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.071592 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.128601 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.386120 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.387371 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.390615 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.421673 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.500002 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.500262 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.539566 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.543275 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbnp\" (UniqueName: \"kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.543335 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.543405 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.543867 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.543963 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.551619 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.645680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbnp\" (UniqueName: \"kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.645733 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: 
\"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.645776 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.645937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.647140 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.647610 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.647915 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.678940 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbnp\" (UniqueName: \"kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp\") pod \"dnsmasq-dns-65c5996dc5-p6ltm\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.728466 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:22 crc kubenswrapper[5127]: I0201 08:38:22.982104 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.031060 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.032942 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.040468 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.056121 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.134173 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.170004 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.170139 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.170165 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.170188 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8bvs\" (UniqueName: \"kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.170211 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272148 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272209 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272238 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8bvs\" (UniqueName: 
\"kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272274 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272361 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.272995 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.273279 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.273799 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.274063 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.295809 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8bvs\" (UniqueName: \"kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs\") pod \"dnsmasq-dns-7bfbc4dfdf-2rhjb\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.378405 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.407939 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.532647 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" event={"ID":"55e7a9f3-e987-48bd-b035-abca8edc61d2","Type":"ContainerStarted","Data":"c9ec260ddc8c1485e43de577616c5f7dbc13e6d136d59215253e3b84c6e8fcf7"} Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.584598 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 01 08:38:23 crc kubenswrapper[5127]: I0201 08:38:23.877508 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:38:24 crc kubenswrapper[5127]: I0201 08:38:24.556313 5127 generic.go:334] "Generic (PLEG): container finished" podID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerID="fe88ffa8cb730e58d0b8fe55f69445caed58ca3a6ccd6672c803b5f753b79ebc" exitCode=0 Feb 01 08:38:24 crc kubenswrapper[5127]: I0201 08:38:24.556733 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" event={"ID":"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc","Type":"ContainerDied","Data":"fe88ffa8cb730e58d0b8fe55f69445caed58ca3a6ccd6672c803b5f753b79ebc"} Feb 01 08:38:24 crc kubenswrapper[5127]: I0201 08:38:24.556768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" event={"ID":"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc","Type":"ContainerStarted","Data":"ff46e559dc4ff2d2720ee697c3925fd62c2bfe5fbd48723a130d9fa206ffb900"} Feb 01 08:38:24 crc kubenswrapper[5127]: I0201 08:38:24.559975 5127 generic.go:334] "Generic (PLEG): container finished" podID="55e7a9f3-e987-48bd-b035-abca8edc61d2" containerID="d6b87e74811f74fc6fe4308a09874a924b24a5d92c3fb61f547d1c5d2590d9bd" exitCode=0 Feb 01 08:38:24 crc kubenswrapper[5127]: I0201 08:38:24.560050 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" event={"ID":"55e7a9f3-e987-48bd-b035-abca8edc61d2","Type":"ContainerDied","Data":"d6b87e74811f74fc6fe4308a09874a924b24a5d92c3fb61f547d1c5d2590d9bd"} Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.017740 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.109621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config\") pod \"55e7a9f3-e987-48bd-b035-abca8edc61d2\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.109672 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnbnp\" (UniqueName: \"kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp\") pod \"55e7a9f3-e987-48bd-b035-abca8edc61d2\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.109706 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb\") pod \"55e7a9f3-e987-48bd-b035-abca8edc61d2\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.109728 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc\") pod \"55e7a9f3-e987-48bd-b035-abca8edc61d2\" (UID: \"55e7a9f3-e987-48bd-b035-abca8edc61d2\") " Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.115845 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp" (OuterVolumeSpecName: "kube-api-access-tnbnp") pod "55e7a9f3-e987-48bd-b035-abca8edc61d2" (UID: "55e7a9f3-e987-48bd-b035-abca8edc61d2"). InnerVolumeSpecName "kube-api-access-tnbnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.132976 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55e7a9f3-e987-48bd-b035-abca8edc61d2" (UID: "55e7a9f3-e987-48bd-b035-abca8edc61d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.144772 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55e7a9f3-e987-48bd-b035-abca8edc61d2" (UID: "55e7a9f3-e987-48bd-b035-abca8edc61d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.144883 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config" (OuterVolumeSpecName: "config") pod "55e7a9f3-e987-48bd-b035-abca8edc61d2" (UID: "55e7a9f3-e987-48bd-b035-abca8edc61d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.211035 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.211062 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnbnp\" (UniqueName: \"kubernetes.io/projected/55e7a9f3-e987-48bd-b035-abca8edc61d2-kube-api-access-tnbnp\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.211073 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.211083 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55e7a9f3-e987-48bd-b035-abca8edc61d2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.572520 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" event={"ID":"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc","Type":"ContainerStarted","Data":"f589733d1ebb478bc2ec33963597af99a2bba63f99508fab18182f9c1d179f38"} Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.572832 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.575034 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" event={"ID":"55e7a9f3-e987-48bd-b035-abca8edc61d2","Type":"ContainerDied","Data":"c9ec260ddc8c1485e43de577616c5f7dbc13e6d136d59215253e3b84c6e8fcf7"} Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.575078 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c5996dc5-p6ltm" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.575543 5127 scope.go:117] "RemoveContainer" containerID="d6b87e74811f74fc6fe4308a09874a924b24a5d92c3fb61f547d1c5d2590d9bd" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.603070 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" podStartSLOduration=3.603048991 podStartE2EDuration="3.603048991s" podCreationTimestamp="2026-02-01 08:38:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:38:25.593330359 +0000 UTC m=+6656.079232722" watchObservedRunningTime="2026-02-01 08:38:25.603048991 +0000 UTC m=+6656.088951344" Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.653081 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:25 crc kubenswrapper[5127]: I0201 08:38:25.660205 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c5996dc5-p6ltm"] Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.030639 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 01 08:38:26 crc kubenswrapper[5127]: E0201 08:38:26.031301 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55e7a9f3-e987-48bd-b035-abca8edc61d2" containerName="init" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.031349 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="55e7a9f3-e987-48bd-b035-abca8edc61d2" containerName="init" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.031786 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="55e7a9f3-e987-48bd-b035-abca8edc61d2" containerName="init" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.033150 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.035086 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.047955 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.123318 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r674h\" (UniqueName: \"kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.123446 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.123470 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.224931 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.224982 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.225091 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r674h\" (UniqueName: \"kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.229576 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.229677 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/927d4b060ba5222e626c2db172c9120ab4a39c48847ca0b2774d1840496fe9b9/globalmount\"" pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.230091 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.247430 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55e7a9f3-e987-48bd-b035-abca8edc61d2" path="/var/lib/kubelet/pods/55e7a9f3-e987-48bd-b035-abca8edc61d2/volumes" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.258015 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r674h\" (UniqueName: \"kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.270154 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") pod \"ovn-copy-data\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.359282 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 01 08:38:26 crc kubenswrapper[5127]: I0201 08:38:26.959976 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 01 08:38:26 crc kubenswrapper[5127]: W0201 08:38:26.962932 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56e811b5_c18f_4f40_8ef6_bd6a2f35a62b.slice/crio-eececdd3e88f9e5870fe476afa5022c3cf4a1cd1c33ce566c5ce427378bece76 WatchSource:0}: Error finding container eececdd3e88f9e5870fe476afa5022c3cf4a1cd1c33ce566c5ce427378bece76: Status 404 returned error can't find the container with id eececdd3e88f9e5870fe476afa5022c3cf4a1cd1c33ce566c5ce427378bece76 Feb 01 08:38:27 crc kubenswrapper[5127]: I0201 08:38:27.607158 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b","Type":"ContainerStarted","Data":"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b"} Feb 01 08:38:27 crc kubenswrapper[5127]: I0201 08:38:27.607577 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b","Type":"ContainerStarted","Data":"eececdd3e88f9e5870fe476afa5022c3cf4a1cd1c33ce566c5ce427378bece76"} Feb 01 08:38:27 crc kubenswrapper[5127]: I0201 08:38:27.629430 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.276969787 podStartE2EDuration="2.629409379s" podCreationTimestamp="2026-02-01 08:38:25 +0000 UTC" firstStartedPulling="2026-02-01 08:38:26.964900811 +0000 UTC m=+6657.450803174" lastFinishedPulling="2026-02-01 08:38:27.317340373 +0000 UTC m=+6657.803242766" observedRunningTime="2026-02-01 08:38:27.626605384 +0000 UTC m=+6658.112507777" watchObservedRunningTime="2026-02-01 08:38:27.629409379 +0000 UTC m=+6658.115311752" Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.410467 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.494108 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.494385 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="dnsmasq-dns" containerID="cri-o://da956daa74f03cc11ce83ba491f3d1510f66b6ffa98a50c2a819e6e34b48fa6b" gracePeriod=10 Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.663963 5127 generic.go:334] "Generic (PLEG): container finished" podID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerID="da956daa74f03cc11ce83ba491f3d1510f66b6ffa98a50c2a819e6e34b48fa6b" exitCode=0 Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.664048 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" event={"ID":"6b4b6f0f-e6e8-40a8-befa-73332751b5a1","Type":"ContainerDied","Data":"da956daa74f03cc11ce83ba491f3d1510f66b6ffa98a50c2a819e6e34b48fa6b"} Feb 01 08:38:33 crc kubenswrapper[5127]: I0201 08:38:33.966724 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.061050 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc\") pod \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.061418 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lxdh\" (UniqueName: \"kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh\") pod \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.061513 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config\") pod \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\" (UID: \"6b4b6f0f-e6e8-40a8-befa-73332751b5a1\") " Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.068336 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh" (OuterVolumeSpecName: "kube-api-access-5lxdh") pod "6b4b6f0f-e6e8-40a8-befa-73332751b5a1" (UID: "6b4b6f0f-e6e8-40a8-befa-73332751b5a1"). InnerVolumeSpecName "kube-api-access-5lxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.112053 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b4b6f0f-e6e8-40a8-befa-73332751b5a1" (UID: "6b4b6f0f-e6e8-40a8-befa-73332751b5a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.127386 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config" (OuterVolumeSpecName: "config") pod "6b4b6f0f-e6e8-40a8-befa-73332751b5a1" (UID: "6b4b6f0f-e6e8-40a8-befa-73332751b5a1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.162918 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.162959 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lxdh\" (UniqueName: \"kubernetes.io/projected/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-kube-api-access-5lxdh\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.162976 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b4b6f0f-e6e8-40a8-befa-73332751b5a1-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.674910 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" event={"ID":"6b4b6f0f-e6e8-40a8-befa-73332751b5a1","Type":"ContainerDied","Data":"284eaf68a1007be940d00d74aa173e4d5a69ac8d8ad52320d569fb2dd1f92a91"} Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.675016 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cbb469cd9-wc4fv" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.675022 5127 scope.go:117] "RemoveContainer" containerID="da956daa74f03cc11ce83ba491f3d1510f66b6ffa98a50c2a819e6e34b48fa6b" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.699173 5127 scope.go:117] "RemoveContainer" containerID="36e54476cffe04c5394ab0e8f6b3dbf258f9c6c320142132373db5cf55546582" Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.701525 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:38:34 crc kubenswrapper[5127]: I0201 08:38:34.708224 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cbb469cd9-wc4fv"] Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.238841 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 01 08:38:35 crc kubenswrapper[5127]: E0201 08:38:35.249440 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="init" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.249482 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="init" Feb 01 08:38:35 crc kubenswrapper[5127]: E0201 08:38:35.249495 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="dnsmasq-dns" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.249502 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="dnsmasq-dns" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.250015 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" containerName="dnsmasq-dns" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.273834 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.277101 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.278592 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.296347 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ktq6k" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.306336 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.385011 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.385081 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbxgf\" (UniqueName: \"kubernetes.io/projected/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-kube-api-access-mbxgf\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.385124 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-scripts\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.385149 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-config\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.385191 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.486893 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.486984 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbxgf\" (UniqueName: \"kubernetes.io/projected/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-kube-api-access-mbxgf\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.487031 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-scripts\") pod 
\"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.487058 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-config\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.487099 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.487949 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.488479 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-config\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.488951 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-scripts\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.501818 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.514801 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbxgf\" (UniqueName: \"kubernetes.io/projected/eda4fd4c-e6bd-44ab-8790-d65b2e2054a6-kube-api-access-mbxgf\") pod \"ovn-northd-0\" (UID: \"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6\") " pod="openstack/ovn-northd-0" Feb 01 08:38:35 crc kubenswrapper[5127]: I0201 08:38:35.601898 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 01 08:38:36 crc kubenswrapper[5127]: I0201 08:38:36.100250 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 01 08:38:36 crc kubenswrapper[5127]: W0201 08:38:36.113560 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeda4fd4c_e6bd_44ab_8790_d65b2e2054a6.slice/crio-03776b7c4f5b8317b7afdb2f38f830ed5e9f62da16a68f16d2e6ef78a269f359 WatchSource:0}: Error finding container 03776b7c4f5b8317b7afdb2f38f830ed5e9f62da16a68f16d2e6ef78a269f359: Status 404 returned error can't find the container with id 03776b7c4f5b8317b7afdb2f38f830ed5e9f62da16a68f16d2e6ef78a269f359 Feb 01 08:38:36 crc kubenswrapper[5127]: I0201 08:38:36.244251 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4b6f0f-e6e8-40a8-befa-73332751b5a1" path="/var/lib/kubelet/pods/6b4b6f0f-e6e8-40a8-befa-73332751b5a1/volumes" Feb 01 08:38:36 crc kubenswrapper[5127]: I0201 08:38:36.698506 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6","Type":"ContainerStarted","Data":"03776b7c4f5b8317b7afdb2f38f830ed5e9f62da16a68f16d2e6ef78a269f359"} Feb 01 08:38:36 crc kubenswrapper[5127]: I0201 08:38:36.743247 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:38:36 crc kubenswrapper[5127]: I0201 08:38:36.743327 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:38:37 crc kubenswrapper[5127]: I0201 08:38:37.710876 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6","Type":"ContainerStarted","Data":"9490f1a1ff084aa96e8bad2b1a6c42c81fbb66849155747eb280cb7599ca4e7a"} Feb 01 08:38:37 crc kubenswrapper[5127]: I0201 08:38:37.711154 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"eda4fd4c-e6bd-44ab-8790-d65b2e2054a6","Type":"ContainerStarted","Data":"60aababf3b2fa73ceff27b1c8cf2040ea1ff5776ea2c335009bd3dcf637249e9"} Feb 01 08:38:37 crc kubenswrapper[5127]: I0201 08:38:37.711173 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 01 08:38:37 crc kubenswrapper[5127]: I0201 08:38:37.739873 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.062299346 podStartE2EDuration="2.739844925s" podCreationTimestamp="2026-02-01 08:38:35 +0000 UTC" firstStartedPulling="2026-02-01 08:38:36.119047348 +0000 UTC m=+6666.604949711" lastFinishedPulling="2026-02-01 08:38:36.796592927 +0000 UTC m=+6667.282495290" observedRunningTime="2026-02-01 08:38:37.730245927 +0000 UTC m=+6668.216148330" watchObservedRunningTime="2026-02-01 08:38:37.739844925 +0000 UTC m=+6668.225747318" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.138012 5127 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-create-sqpmb"] Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.140126 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sqpmb" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.148785 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b02d-account-create-update-vnwlv"] Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.149906 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b02d-account-create-update-vnwlv" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.153872 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.158915 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sqpmb"] Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.166435 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b02d-account-create-update-vnwlv"] Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.245971 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.246137 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts\") pod \"keystone-db-create-sqpmb\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.246339 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl9l8\" (UniqueName: \"kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.246375 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvgn\" (UniqueName: \"kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn\") pod \"keystone-db-create-sqpmb\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.348177 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl9l8\" (UniqueName: \"kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv" Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.348257 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvgn\" (UniqueName: \"kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn\") pod \"keystone-db-create-sqpmb\" (UID: 
\"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.348314 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.348347 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts\") pod \"keystone-db-create-sqpmb\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.349314 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts\") pod \"keystone-db-create-sqpmb\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.349475 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.372988 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl9l8\" (UniqueName: \"kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8\") pod \"keystone-b02d-account-create-update-vnwlv\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") " pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.373133 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvgn\" (UniqueName: \"kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn\") pod \"keystone-db-create-sqpmb\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") " pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.463920 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.473123 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:43 crc kubenswrapper[5127]: I0201 08:38:43.935928 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-sqpmb"]
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.082629 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b02d-account-create-update-vnwlv"]
Feb 01 08:38:44 crc kubenswrapper[5127]: W0201 08:38:44.091363 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8142c78_3373_4279_b762_933b0d61711b.slice/crio-0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626 WatchSource:0}: Error finding container 0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626: Status 404 returned error can't find the container with id 0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.811419 5127 generic.go:334] "Generic (PLEG): container finished" podID="c8142c78-3373-4279-b762-933b0d61711b" containerID="0f988b660e823ed3c10669a713f638558d99d8476f293d13703a52c5d896bc54" exitCode=0
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.811529 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b02d-account-create-update-vnwlv" event={"ID":"c8142c78-3373-4279-b762-933b0d61711b","Type":"ContainerDied","Data":"0f988b660e823ed3c10669a713f638558d99d8476f293d13703a52c5d896bc54"}
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.812011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b02d-account-create-update-vnwlv" event={"ID":"c8142c78-3373-4279-b762-933b0d61711b","Type":"ContainerStarted","Data":"0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626"}
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.815490 5127 generic.go:334] "Generic (PLEG): container finished" podID="5127f0eb-21e9-4559-b744-57e7ad40df33" containerID="72ca46e50733b0f6f0afbaabbebaeb930db3c6653e35135bde5f232bc45c05de" exitCode=0
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.815579 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sqpmb" event={"ID":"5127f0eb-21e9-4559-b744-57e7ad40df33","Type":"ContainerDied","Data":"72ca46e50733b0f6f0afbaabbebaeb930db3c6653e35135bde5f232bc45c05de"}
Feb 01 08:38:44 crc kubenswrapper[5127]: I0201 08:38:44.815664 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sqpmb" event={"ID":"5127f0eb-21e9-4559-b744-57e7ad40df33","Type":"ContainerStarted","Data":"8894c639af0a2fb2242210da3f99edeac4763d36b84e45c9f764ea1cad29a14c"}
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.238512 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.247503 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.314974 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts\") pod \"5127f0eb-21e9-4559-b744-57e7ad40df33\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") "
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.315084 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts\") pod \"c8142c78-3373-4279-b762-933b0d61711b\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") "
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.315214 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl9l8\" (UniqueName: \"kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8\") pod \"c8142c78-3373-4279-b762-933b0d61711b\" (UID: \"c8142c78-3373-4279-b762-933b0d61711b\") "
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.315292 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkvgn\" (UniqueName: \"kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn\") pod \"5127f0eb-21e9-4559-b744-57e7ad40df33\" (UID: \"5127f0eb-21e9-4559-b744-57e7ad40df33\") "
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.316688 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5127f0eb-21e9-4559-b744-57e7ad40df33" (UID: "5127f0eb-21e9-4559-b744-57e7ad40df33"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.317140 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8142c78-3373-4279-b762-933b0d61711b" (UID: "c8142c78-3373-4279-b762-933b0d61711b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.325473 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8" (OuterVolumeSpecName: "kube-api-access-sl9l8") pod "c8142c78-3373-4279-b762-933b0d61711b" (UID: "c8142c78-3373-4279-b762-933b0d61711b"). InnerVolumeSpecName "kube-api-access-sl9l8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.326675 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn" (OuterVolumeSpecName: "kube-api-access-qkvgn") pod "5127f0eb-21e9-4559-b744-57e7ad40df33" (UID: "5127f0eb-21e9-4559-b744-57e7ad40df33"). InnerVolumeSpecName "kube-api-access-qkvgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.417600 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5127f0eb-21e9-4559-b744-57e7ad40df33-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.417650 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8142c78-3373-4279-b762-933b0d61711b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.417667 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl9l8\" (UniqueName: \"kubernetes.io/projected/c8142c78-3373-4279-b762-933b0d61711b-kube-api-access-sl9l8\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.417681 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkvgn\" (UniqueName: \"kubernetes.io/projected/5127f0eb-21e9-4559-b744-57e7ad40df33-kube-api-access-qkvgn\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.833826 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-sqpmb" event={"ID":"5127f0eb-21e9-4559-b744-57e7ad40df33","Type":"ContainerDied","Data":"8894c639af0a2fb2242210da3f99edeac4763d36b84e45c9f764ea1cad29a14c"}
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.833875 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8894c639af0a2fb2242210da3f99edeac4763d36b84e45c9f764ea1cad29a14c"
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.833899 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-sqpmb"
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.836862 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b02d-account-create-update-vnwlv" event={"ID":"c8142c78-3373-4279-b762-933b0d61711b","Type":"ContainerDied","Data":"0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626"}
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.836930 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bcd6a0e911ebfbe0da8805e4030da77dcfd2f6f03dadea11a97dd968985b626"
Feb 01 08:38:46 crc kubenswrapper[5127]: I0201 08:38:46.837183 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b02d-account-create-update-vnwlv"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.718695 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-m5j5j"]
Feb 01 08:38:48 crc kubenswrapper[5127]: E0201 08:38:48.719404 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8142c78-3373-4279-b762-933b0d61711b" containerName="mariadb-account-create-update"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.719421 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8142c78-3373-4279-b762-933b0d61711b" containerName="mariadb-account-create-update"
Feb 01 08:38:48 crc kubenswrapper[5127]: E0201 08:38:48.719434 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5127f0eb-21e9-4559-b744-57e7ad40df33" containerName="mariadb-database-create"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.719443 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5127f0eb-21e9-4559-b744-57e7ad40df33" containerName="mariadb-database-create"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.719609 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8142c78-3373-4279-b762-933b0d61711b" containerName="mariadb-account-create-update"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.719624 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5127f0eb-21e9-4559-b744-57e7ad40df33" containerName="mariadb-database-create"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.720244 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.723488 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.723778 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vmw2l"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.723963 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.724090 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.729480 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m5j5j"]
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.859519 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9g8l\" (UniqueName: \"kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.859654 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.859692 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.961901 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9g8l\" (UniqueName: \"kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.962049 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.962089 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.969837 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.976205 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:48 crc kubenswrapper[5127]: I0201 08:38:48.982310 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9g8l\" (UniqueName: \"kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l\") pod \"keystone-db-sync-m5j5j\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") " pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:49 crc kubenswrapper[5127]: I0201 08:38:49.039110 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:49 crc kubenswrapper[5127]: I0201 08:38:49.500532 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-m5j5j"]
Feb 01 08:38:49 crc kubenswrapper[5127]: W0201 08:38:49.513295 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bbdf245_3582_4862_b9e6_5551b459025b.slice/crio-c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8 WatchSource:0}: Error finding container c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8: Status 404 returned error can't find the container with id c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8
Feb 01 08:38:49 crc kubenswrapper[5127]: I0201 08:38:49.870018 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m5j5j" event={"ID":"8bbdf245-3582-4862-b9e6-5551b459025b","Type":"ContainerStarted","Data":"c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8"}
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.357554 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-knggn"]
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.365686 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.369801 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knggn"]
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.465249 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz64c\" (UniqueName: \"kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.465344 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.465675 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.567365 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz64c\" (UniqueName: \"kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.567435 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.567486 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.568021 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.568201 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.590077 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz64c\" (UniqueName: \"kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c\") pod \"community-operators-knggn\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:54 crc kubenswrapper[5127]: I0201 08:38:54.695269 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:38:55 crc kubenswrapper[5127]: I0201 08:38:55.686686 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.370466 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-knggn"]
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.934205 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerID="1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c" exitCode=0
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.934689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerDied","Data":"1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c"}
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.934763 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerStarted","Data":"20fa1d23aa4988c7432df2fa443f821104fa4e09b83ee9009013d79a58ee5866"}
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.936278 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m5j5j" event={"ID":"8bbdf245-3582-4862-b9e6-5551b459025b","Type":"ContainerStarted","Data":"4435ef192242c7aad77f3220df4e309299a7f14399017092aa5e357f780ff509"}
Feb 01 08:38:56 crc kubenswrapper[5127]: I0201 08:38:56.986943 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-m5j5j" podStartSLOduration=2.5712245559999998 podStartE2EDuration="8.986908197s" podCreationTimestamp="2026-02-01 08:38:48 +0000 UTC" firstStartedPulling="2026-02-01 08:38:49.516843198 +0000 UTC m=+6680.002745601" lastFinishedPulling="2026-02-01 08:38:55.932526879 +0000 UTC m=+6686.418429242" observedRunningTime="2026-02-01 08:38:56.982472437 +0000 UTC m=+6687.468374830" watchObservedRunningTime="2026-02-01 08:38:56.986908197 +0000 UTC m=+6687.472810590"
Feb 01 08:38:57 crc kubenswrapper[5127]: I0201 08:38:57.946937 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerStarted","Data":"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381"}
Feb 01 08:38:57 crc kubenswrapper[5127]: I0201 08:38:57.948739 5127 generic.go:334] "Generic (PLEG): container finished" podID="8bbdf245-3582-4862-b9e6-5551b459025b" containerID="4435ef192242c7aad77f3220df4e309299a7f14399017092aa5e357f780ff509" exitCode=0
Feb 01 08:38:57 crc kubenswrapper[5127]: I0201 08:38:57.948800 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m5j5j" event={"ID":"8bbdf245-3582-4862-b9e6-5551b459025b","Type":"ContainerDied","Data":"4435ef192242c7aad77f3220df4e309299a7f14399017092aa5e357f780ff509"}
Feb 01 08:38:58 crc kubenswrapper[5127]: I0201 08:38:58.961762 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerID="3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381" exitCode=0
Feb 01 08:38:58 crc kubenswrapper[5127]: I0201 08:38:58.961869 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerDied","Data":"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381"}
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.354222 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.449957 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle\") pod \"8bbdf245-3582-4862-b9e6-5551b459025b\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") "
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.450103 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9g8l\" (UniqueName: \"kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l\") pod \"8bbdf245-3582-4862-b9e6-5551b459025b\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") "
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.450156 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data\") pod \"8bbdf245-3582-4862-b9e6-5551b459025b\" (UID: \"8bbdf245-3582-4862-b9e6-5551b459025b\") "
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.456239 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l" (OuterVolumeSpecName: "kube-api-access-g9g8l") pod "8bbdf245-3582-4862-b9e6-5551b459025b" (UID: "8bbdf245-3582-4862-b9e6-5551b459025b"). InnerVolumeSpecName "kube-api-access-g9g8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.480756 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bbdf245-3582-4862-b9e6-5551b459025b" (UID: "8bbdf245-3582-4862-b9e6-5551b459025b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.492665 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data" (OuterVolumeSpecName: "config-data") pod "8bbdf245-3582-4862-b9e6-5551b459025b" (UID: "8bbdf245-3582-4862-b9e6-5551b459025b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.551769 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9g8l\" (UniqueName: \"kubernetes.io/projected/8bbdf245-3582-4862-b9e6-5551b459025b-kube-api-access-g9g8l\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.552054 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.552068 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bbdf245-3582-4862-b9e6-5551b459025b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.978280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerStarted","Data":"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33"}
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.981741 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-m5j5j" event={"ID":"8bbdf245-3582-4862-b9e6-5551b459025b","Type":"ContainerDied","Data":"c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8"}
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.981784 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0fa1f5822da8c26fed2573928d0278faaea2861de703d8eb8bcbaa9800a1da8"
Feb 01 08:38:59 crc kubenswrapper[5127]: I0201 08:38:59.981853 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-m5j5j"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.009972 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-knggn" podStartSLOduration=3.606237209 podStartE2EDuration="6.00994556s" podCreationTimestamp="2026-02-01 08:38:54 +0000 UTC" firstStartedPulling="2026-02-01 08:38:56.939930803 +0000 UTC m=+6687.425833196" lastFinishedPulling="2026-02-01 08:38:59.343639174 +0000 UTC m=+6689.829541547" observedRunningTime="2026-02-01 08:39:00.007015872 +0000 UTC m=+6690.492918265" watchObservedRunningTime="2026-02-01 08:39:00.00994556 +0000 UTC m=+6690.495847963"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.224040 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-twvf7"]
Feb 01 08:39:00 crc kubenswrapper[5127]: E0201 08:39:00.224675 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbdf245-3582-4862-b9e6-5551b459025b" containerName="keystone-db-sync"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.224695 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbdf245-3582-4862-b9e6-5551b459025b" containerName="keystone-db-sync"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.224870 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bbdf245-3582-4862-b9e6-5551b459025b" containerName="keystone-db-sync"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.225572 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.232571 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.232630 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.232747 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.232823 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vmw2l"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.232848 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.250790 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-twvf7"]
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.265793 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.265839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lz7\" (UniqueName: \"kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.265899 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.265928 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.265968 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.266017 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.266659 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"]
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.268118 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.276776 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"]
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.366816 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lz7\" (UniqueName: \"kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.366900 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.366965 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.366997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367033 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367068 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367104 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367144 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367166 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367195 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcqgm\" (UniqueName: \"kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.367216 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.375029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.383821 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.384655 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.386041 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.386381 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lz7\" (UniqueName: \"kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.387609 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data\") pod \"keystone-bootstrap-twvf7\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") " pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.468475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcqgm\" (UniqueName: \"kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.468530 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.468643 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.468688 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.468770 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.469395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.469543 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.469570 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.470291 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.494871 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcqgm\" (UniqueName: \"kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm\") pod \"dnsmasq-dns-896bdd75c-j4ptz\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.553279 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:00 crc kubenswrapper[5127]: I0201 08:39:00.586027 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:01 crc kubenswrapper[5127]: I0201 08:39:01.077351 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-twvf7"]
Feb 01 08:39:01 crc kubenswrapper[5127]: W0201 08:39:01.081942 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9388fb80_ae6e_417e_a021_6c34fbd93a5a.slice/crio-8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5 WatchSource:0}: Error finding container 8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5: Status 404 returned error can't find the container with id 8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5
Feb 01 08:39:01 crc kubenswrapper[5127]: I0201 08:39:01.121951 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"]
Feb 01 08:39:01 crc kubenswrapper[5127]: W0201 08:39:01.124511 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37cdb208_65b0_42ca_b90f_7b7246e58c55.slice/crio-46438fc15cb880365afb319b0c48f4293a008e535bfc03abb4c1106d042474b3 WatchSource:0}: Error finding container 46438fc15cb880365afb319b0c48f4293a008e535bfc03abb4c1106d042474b3: Status 404 returned error can't find the container with id 46438fc15cb880365afb319b0c48f4293a008e535bfc03abb4c1106d042474b3
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.001826 5127 generic.go:334] "Generic (PLEG): container finished" podID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerID="b18b347e2c927d1858c317854321dda4ed37cbfcb7d93cb3aa516db573eb3134" exitCode=0
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.001896 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" event={"ID":"37cdb208-65b0-42ca-b90f-7b7246e58c55","Type":"ContainerDied","Data":"b18b347e2c927d1858c317854321dda4ed37cbfcb7d93cb3aa516db573eb3134"}
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.002208 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" event={"ID":"37cdb208-65b0-42ca-b90f-7b7246e58c55","Type":"ContainerStarted","Data":"46438fc15cb880365afb319b0c48f4293a008e535bfc03abb4c1106d042474b3"}
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.005860 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-twvf7" event={"ID":"9388fb80-ae6e-417e-a021-6c34fbd93a5a","Type":"ContainerStarted","Data":"87e22cc031b513a908bd0676ad35878354bc55f93252bec8875ef2c86a1ac889"}
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.005912 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-twvf7" event={"ID":"9388fb80-ae6e-417e-a021-6c34fbd93a5a","Type":"ContainerStarted","Data":"8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5"}
Feb 01 08:39:02 crc kubenswrapper[5127]: I0201 08:39:02.047995 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-twvf7" podStartSLOduration=2.047969932 podStartE2EDuration="2.047969932s" podCreationTimestamp="2026-02-01 08:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:39:02.044940951 +0000 UTC m=+6692.530843364" watchObservedRunningTime="2026-02-01 08:39:02.047969932 +0000 UTC m=+6692.533872315"
Feb 01 08:39:03 crc kubenswrapper[5127]: I0201 08:39:03.014937 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" event={"ID":"37cdb208-65b0-42ca-b90f-7b7246e58c55","Type":"ContainerStarted","Data":"b9ba7a45f43976276c4e449cd305f5884f96110b8b919c8a267daafd50c7106d"}
Feb 01 08:39:03 crc kubenswrapper[5127]: I0201 08:39:03.015278 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz"
Feb 01 08:39:03 crc kubenswrapper[5127]: I0201 08:39:03.046824 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" podStartSLOduration=3.046802116 podStartE2EDuration="3.046802116s" podCreationTimestamp="2026-02-01 08:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:39:03.039835488 +0000 UTC m=+6693.525737871" watchObservedRunningTime="2026-02-01 08:39:03.046802116 +0000 UTC m=+6693.532704489"
Feb 01 08:39:04 crc kubenswrapper[5127]: I0201 08:39:04.696061 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:39:04 crc kubenswrapper[5127]: I0201 08:39:04.696377 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:39:04 crc kubenswrapper[5127]: I0201 08:39:04.762369 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:39:05 crc kubenswrapper[5127]: I0201 08:39:05.032575 5127 generic.go:334] "Generic (PLEG): container finished" podID="9388fb80-ae6e-417e-a021-6c34fbd93a5a" containerID="87e22cc031b513a908bd0676ad35878354bc55f93252bec8875ef2c86a1ac889" exitCode=0
Feb 01 08:39:05 crc kubenswrapper[5127]: I0201 08:39:05.032634 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-twvf7" event={"ID":"9388fb80-ae6e-417e-a021-6c34fbd93a5a","Type":"ContainerDied","Data":"87e22cc031b513a908bd0676ad35878354bc55f93252bec8875ef2c86a1ac889"}
Feb 01 08:39:05 crc kubenswrapper[5127]: I0201 08:39:05.100878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-knggn"
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.379062 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485329 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485383 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lz7\" (UniqueName: \"kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485409 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485445 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485488 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.485551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts\") pod \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\" (UID: \"9388fb80-ae6e-417e-a021-6c34fbd93a5a\") "
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.490862 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.490979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7" (OuterVolumeSpecName: "kube-api-access-66lz7") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "kube-api-access-66lz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.491214 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.491718 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts" (OuterVolumeSpecName: "scripts") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.519012 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data" (OuterVolumeSpecName: "config-data") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.523553 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9388fb80-ae6e-417e-a021-6c34fbd93a5a" (UID: "9388fb80-ae6e-417e-a021-6c34fbd93a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588140 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588555 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588633 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588649 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66lz7\" (UniqueName: \"kubernetes.io/projected/9388fb80-ae6e-417e-a021-6c34fbd93a5a-kube-api-access-66lz7\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588661 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.588675 5127 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9388fb80-ae6e-417e-a021-6c34fbd93a5a-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.741572 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:39:06 crc kubenswrapper[5127]: I0201 08:39:06.741700 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.057039 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-twvf7" event={"ID":"9388fb80-ae6e-417e-a021-6c34fbd93a5a","Type":"ContainerDied","Data":"8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5"}
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.057082 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d5005586c6b9eba5e4e991cbd50e442248aaba9b67ff8c78ad77aebe2196fe5"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.057453 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-twvf7"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.139263 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-twvf7"]
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.146999 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-twvf7"]
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.214665 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tvzdq"]
Feb 01 08:39:07 crc kubenswrapper[5127]: E0201 08:39:07.215018 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9388fb80-ae6e-417e-a021-6c34fbd93a5a" containerName="keystone-bootstrap"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.215036 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9388fb80-ae6e-417e-a021-6c34fbd93a5a" containerName="keystone-bootstrap"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.215211 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9388fb80-ae6e-417e-a021-6c34fbd93a5a" containerName="keystone-bootstrap"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.215822 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.218345 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.218496 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.218549 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.218691 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.218719 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vmw2l"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.235935 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tvzdq"]
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404293 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpz28\" (UniqueName: \"kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404362 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404480 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404527 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404643 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.404671 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.505725 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.505811 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.505891 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.505923 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.505974 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpz28\" (UniqueName: \"kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.506010 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.511541 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.511678 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.512148 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.513107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.515744 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.536698 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpz28\" (UniqueName: \"kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28\") pod \"keystone-bootstrap-tvzdq\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:07 crc kubenswrapper[5127]: I0201 08:39:07.546634 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvzdq"
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.025996 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tvzdq"]
Feb 01 08:39:08 crc kubenswrapper[5127]: W0201 08:39:08.031561 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66212b37_2c86_4588_badb_15a7e9b260a6.slice/crio-4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804 WatchSource:0}: Error finding container 4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804: Status 404 returned error can't find the container with id 4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.076375 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvzdq" event={"ID":"66212b37-2c86-4588-badb-15a7e9b260a6","Type":"ContainerStarted","Data":"4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804"}
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.219716 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knggn"]
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.220165 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-knggn" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="registry-server" containerID="cri-o://d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33" gracePeriod=2
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.248661 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9388fb80-ae6e-417e-a021-6c34fbd93a5a" path="/var/lib/kubelet/pods/9388fb80-ae6e-417e-a021-6c34fbd93a5a/volumes"
Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.709870 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-knggn" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.840629 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content\") pod \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.840690 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities\") pod \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.840844 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz64c\" (UniqueName: \"kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c\") pod \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\" (UID: \"9f7081cc-bc09-48d6-a1d4-57c6be28f75b\") " Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.842407 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities" (OuterVolumeSpecName: "utilities") pod "9f7081cc-bc09-48d6-a1d4-57c6be28f75b" (UID: "9f7081cc-bc09-48d6-a1d4-57c6be28f75b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.845242 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c" (OuterVolumeSpecName: "kube-api-access-mz64c") pod "9f7081cc-bc09-48d6-a1d4-57c6be28f75b" (UID: "9f7081cc-bc09-48d6-a1d4-57c6be28f75b"). InnerVolumeSpecName "kube-api-access-mz64c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.897260 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f7081cc-bc09-48d6-a1d4-57c6be28f75b" (UID: "9f7081cc-bc09-48d6-a1d4-57c6be28f75b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.942119 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz64c\" (UniqueName: \"kubernetes.io/projected/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-kube-api-access-mz64c\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.942155 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:08 crc kubenswrapper[5127]: I0201 08:39:08.942166 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f7081cc-bc09-48d6-a1d4-57c6be28f75b-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.086103 5127 generic.go:334] "Generic (PLEG): container finished" podID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerID="d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33" exitCode=0 Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.086213 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-knggn" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.086249 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerDied","Data":"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33"} Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.087060 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-knggn" event={"ID":"9f7081cc-bc09-48d6-a1d4-57c6be28f75b","Type":"ContainerDied","Data":"20fa1d23aa4988c7432df2fa443f821104fa4e09b83ee9009013d79a58ee5866"} Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.087089 5127 scope.go:117] "RemoveContainer" containerID="d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.088989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvzdq" event={"ID":"66212b37-2c86-4588-badb-15a7e9b260a6","Type":"ContainerStarted","Data":"b178f58c1b7067de0d8c17ee57538e2eede6ca713a22b4f2a0a1a1df1dd80c21"} Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.114492 5127 scope.go:117] "RemoveContainer" containerID="3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.123779 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tvzdq" podStartSLOduration=2.123760513 podStartE2EDuration="2.123760513s" podCreationTimestamp="2026-02-01 08:39:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:39:09.122126078 +0000 UTC m=+6699.608028471" watchObservedRunningTime="2026-02-01 08:39:09.123760513 +0000 UTC m=+6699.609662886" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.144905 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-knggn"] Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.147121 5127 scope.go:117] "RemoveContainer" containerID="1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c" Feb 
01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.153506 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-knggn"] Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.193682 5127 scope.go:117] "RemoveContainer" containerID="d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33" Feb 01 08:39:09 crc kubenswrapper[5127]: E0201 08:39:09.194409 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33\": container with ID starting with d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33 not found: ID does not exist" containerID="d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.194489 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33"} err="failed to get container status \"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33\": rpc error: code = NotFound desc = could not find container \"d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33\": container with ID starting with d5ba4f2eae2141ee0eef00c16b9def4521b2123439fb9fd499550462047e8d33 not found: ID does not exist" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.194526 5127 scope.go:117] "RemoveContainer" containerID="3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381" Feb 01 08:39:09 crc kubenswrapper[5127]: E0201 08:39:09.195005 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381\": container with ID starting with 3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381 not found: ID does not exist" containerID="3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.195080 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381"} err="failed to get container status \"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381\": rpc error: code = NotFound desc = could not find container \"3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381\": container with ID starting with 3329a7ceb51d1465b69af2965257fdd2910b539740eae71f379e9b31090d4381 not found: ID does not exist" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.195123 5127 scope.go:117] "RemoveContainer" containerID="1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c" Feb 01 08:39:09 crc kubenswrapper[5127]: E0201 08:39:09.195477 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c\": container with ID starting with 1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c not found: ID does not exist" containerID="1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c" Feb 01 08:39:09 crc kubenswrapper[5127]: I0201 08:39:09.195508 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c"} err="failed to get container status 
\"1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c\": rpc error: code = NotFound desc = could not find container \"1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c\": container with ID starting with 1c55b5348f85a83115dfeba38519bdaba1581e7c991e3afeea64c6108a4cdb9c not found: ID does not exist" Feb 01 08:39:10 crc kubenswrapper[5127]: I0201 08:39:10.257221 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" path="/var/lib/kubelet/pods/9f7081cc-bc09-48d6-a1d4-57c6be28f75b/volumes" Feb 01 08:39:10 crc kubenswrapper[5127]: I0201 08:39:10.587813 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" Feb 01 08:39:10 crc kubenswrapper[5127]: I0201 08:39:10.661720 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:39:10 crc kubenswrapper[5127]: I0201 08:39:10.664325 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="dnsmasq-dns" containerID="cri-o://f589733d1ebb478bc2ec33963597af99a2bba63f99508fab18182f9c1d179f38" gracePeriod=10 Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.113556 5127 generic.go:334] "Generic (PLEG): container finished" podID="66212b37-2c86-4588-badb-15a7e9b260a6" containerID="b178f58c1b7067de0d8c17ee57538e2eede6ca713a22b4f2a0a1a1df1dd80c21" exitCode=0 Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.113637 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvzdq" event={"ID":"66212b37-2c86-4588-badb-15a7e9b260a6","Type":"ContainerDied","Data":"b178f58c1b7067de0d8c17ee57538e2eede6ca713a22b4f2a0a1a1df1dd80c21"} Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.116058 5127 generic.go:334] "Generic (PLEG): container finished" podID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerID="f589733d1ebb478bc2ec33963597af99a2bba63f99508fab18182f9c1d179f38" exitCode=0 Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.116181 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" event={"ID":"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc","Type":"ContainerDied","Data":"f589733d1ebb478bc2ec33963597af99a2bba63f99508fab18182f9c1d179f38"} Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.471258 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.587361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb\") pod \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.587548 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8bvs\" (UniqueName: \"kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs\") pod \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.587594 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc\") pod \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.587740 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config\") pod \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.587863 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb\") pod \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\" (UID: \"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc\") " Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.596247 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs" (OuterVolumeSpecName: "kube-api-access-n8bvs") pod "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" (UID: "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc"). InnerVolumeSpecName "kube-api-access-n8bvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.631018 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" (UID: "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.651575 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" (UID: "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.655521 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" (UID: "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.657664 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config" (OuterVolumeSpecName: "config") pod "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" (UID: "4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.690074 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.690132 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.690152 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.690178 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8bvs\" (UniqueName: \"kubernetes.io/projected/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-kube-api-access-n8bvs\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:11 crc kubenswrapper[5127]: I0201 08:39:11.690196 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.130362 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" event={"ID":"4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc","Type":"ContainerDied","Data":"ff46e559dc4ff2d2720ee697c3925fd62c2bfe5fbd48723a130d9fa206ffb900"} Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.130450 5127 scope.go:117] "RemoveContainer" containerID="f589733d1ebb478bc2ec33963597af99a2bba63f99508fab18182f9c1d179f38" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.130384 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.165625 5127 scope.go:117] "RemoveContainer" containerID="fe88ffa8cb730e58d0b8fe55f69445caed58ca3a6ccd6672c803b5f753b79ebc" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.185785 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.193070 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfbc4dfdf-2rhjb"] Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.255152 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" path="/var/lib/kubelet/pods/4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc/volumes" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.515070 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tvzdq" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.611242 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.611885 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.612081 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.612228 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.612381 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.612530 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpz28\" (UniqueName: \"kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28\") pod \"66212b37-2c86-4588-badb-15a7e9b260a6\" (UID: \"66212b37-2c86-4588-badb-15a7e9b260a6\") " Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.617413 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts" (OuterVolumeSpecName: "scripts") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.617610 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28" (OuterVolumeSpecName: "kube-api-access-hpz28") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "kube-api-access-hpz28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.619903 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.621469 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.644832 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.649730 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data" (OuterVolumeSpecName: "config-data") pod "66212b37-2c86-4588-badb-15a7e9b260a6" (UID: "66212b37-2c86-4588-badb-15a7e9b260a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714725 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714778 5127 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714795 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714808 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714823 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66212b37-2c86-4588-badb-15a7e9b260a6-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:12 crc kubenswrapper[5127]: I0201 08:39:12.714836 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpz28\" (UniqueName: \"kubernetes.io/projected/66212b37-2c86-4588-badb-15a7e9b260a6-kube-api-access-hpz28\") on node \"crc\" DevicePath \"\"" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.141081 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tvzdq" event={"ID":"66212b37-2c86-4588-badb-15a7e9b260a6","Type":"ContainerDied","Data":"4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804"} Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.141135 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4edd2a74544cfcfdd1ee97a055395d6f39258190ff51322bb31200771c5c9804" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.141154 5127 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tvzdq" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.688318 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f4f694774-qc4rf"] Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.688840 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="dnsmasq-dns" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.688863 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="dnsmasq-dns" Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.688889 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="extract-utilities" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.688903 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="extract-utilities" Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.688924 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66212b37-2c86-4588-badb-15a7e9b260a6" containerName="keystone-bootstrap" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.688936 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="66212b37-2c86-4588-badb-15a7e9b260a6" containerName="keystone-bootstrap" Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.688977 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="registry-server" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.688988 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="registry-server" Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.689005 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="init" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.689016 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="init" Feb 01 08:39:13 crc kubenswrapper[5127]: E0201 08:39:13.689036 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="extract-content" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.689048 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="extract-content" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.689272 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="66212b37-2c86-4588-badb-15a7e9b260a6" containerName="keystone-bootstrap" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.689301 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f7081cc-bc09-48d6-a1d4-57c6be28f75b" containerName="registry-server" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.689331 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca1baf8-726b-4aed-b2dc-b29e6f7f81bc" containerName="dnsmasq-dns" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.690178 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.695964 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.696318 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.696550 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.696738 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vmw2l" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.714540 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f4f694774-qc4rf"] Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.835650 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-config-data\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.835865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-combined-ca-bundle\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.835918 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-scripts\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.835960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklp9\" (UniqueName: \"kubernetes.io/projected/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-kube-api-access-nklp9\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.836010 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-fernet-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.836070 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-credential-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937668 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-config-data\") pod \"keystone-f4f694774-qc4rf\" (UID: 
\"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937752 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-combined-ca-bundle\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937784 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-scripts\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937815 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nklp9\" (UniqueName: \"kubernetes.io/projected/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-kube-api-access-nklp9\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937850 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-fernet-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.937894 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-credential-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.942920 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-config-data\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.944949 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-scripts\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.944946 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-fernet-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.952510 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-combined-ca-bundle\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.953016 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-credential-keys\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:13 crc kubenswrapper[5127]: I0201 08:39:13.962177 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklp9\" (UniqueName: \"kubernetes.io/projected/26fde84e-4bfc-4181-8287-e3a1d0ccb81a-kube-api-access-nklp9\") pod \"keystone-f4f694774-qc4rf\" (UID: \"26fde84e-4bfc-4181-8287-e3a1d0ccb81a\") " pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:14 crc kubenswrapper[5127]: I0201 08:39:14.007805 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:14 crc kubenswrapper[5127]: I0201 08:39:14.498243 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f4f694774-qc4rf"] Feb 01 08:39:15 crc kubenswrapper[5127]: I0201 08:39:15.156497 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f4f694774-qc4rf" event={"ID":"26fde84e-4bfc-4181-8287-e3a1d0ccb81a","Type":"ContainerStarted","Data":"f2207bcb4e3536a9a4180272fce44e8c5e0f24f093b78e4651bb87c95c23db96"} Feb 01 08:39:15 crc kubenswrapper[5127]: I0201 08:39:15.156552 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f4f694774-qc4rf" event={"ID":"26fde84e-4bfc-4181-8287-e3a1d0ccb81a","Type":"ContainerStarted","Data":"4278d0b8f9039b3d4cfe37f4b20036c8e9281f2158839598ab874806826906a9"} Feb 01 08:39:15 crc kubenswrapper[5127]: I0201 08:39:15.157475 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:15 crc kubenswrapper[5127]: I0201 08:39:15.174617 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f4f694774-qc4rf" podStartSLOduration=2.174599267 podStartE2EDuration="2.174599267s" podCreationTimestamp="2026-02-01 08:39:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:39:15.171325349 +0000 UTC m=+6705.657227712" watchObservedRunningTime="2026-02-01 08:39:15.174599267 +0000 UTC m=+6705.660501630" Feb 01 08:39:36 crc kubenswrapper[5127]: I0201 08:39:36.741403 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:39:36 crc kubenswrapper[5127]: I0201 08:39:36.742755 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:39:36 crc kubenswrapper[5127]: I0201 08:39:36.742870 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:39:36 crc kubenswrapper[5127]: I0201 08:39:36.744039 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:39:36 crc kubenswrapper[5127]: I0201 08:39:36.744121 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" gracePeriod=600 Feb 01 08:39:36 crc kubenswrapper[5127]: E0201 08:39:36.883699 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:39:37 crc kubenswrapper[5127]: I0201 08:39:37.390005 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" exitCode=0 Feb 01 08:39:37 crc kubenswrapper[5127]: I0201 08:39:37.390102 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d"} Feb 01 08:39:37 crc kubenswrapper[5127]: I0201 08:39:37.390626 5127 scope.go:117] "RemoveContainer" containerID="1313965be7bf98e2a15f3d3d8432b5e3945ebbb0fff233b7d904dc276cb91592" Feb 01 08:39:37 crc kubenswrapper[5127]: I0201 08:39:37.391897 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:39:37 crc kubenswrapper[5127]: E0201 08:39:37.392377 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:39:45 crc kubenswrapper[5127]: I0201 08:39:45.522628 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-f4f694774-qc4rf" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.112110 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.115332 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.118379 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-sg6bk" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.119348 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.119732 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.123104 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.126115 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.126439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.126865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws27h\" (UniqueName: \"kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.228916 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws27h\" (UniqueName: \"kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.229153 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.229243 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.230970 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.240766 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.255936 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws27h\" (UniqueName: \"kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h\") pod \"openstackclient\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.458035 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 01 08:39:48 crc kubenswrapper[5127]: I0201 08:39:48.991872 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 08:39:49 crc kubenswrapper[5127]: I0201 08:39:49.514856 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de6aab82-98b8-4090-bb46-192a713ee9a8","Type":"ContainerStarted","Data":"feb8fcec3b3af9638ccaac1e35b0bdce889e2c6c359c21388768ab27ce694f3e"} Feb 01 08:39:53 crc kubenswrapper[5127]: I0201 08:39:53.236109 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:39:53 crc kubenswrapper[5127]: E0201 08:39:53.236690 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:40:00 crc kubenswrapper[5127]: I0201 08:40:00.626524 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"de6aab82-98b8-4090-bb46-192a713ee9a8","Type":"ContainerStarted","Data":"4674861e140aa628f122b3cb74c8778f7bee29b537e33ed1adbe029b312855fa"} Feb 01 08:40:00 crc kubenswrapper[5127]: I0201 08:40:00.666163 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.977565023 podStartE2EDuration="12.666087282s" podCreationTimestamp="2026-02-01 08:39:48 +0000 UTC" firstStartedPulling="2026-02-01 08:39:49.012182159 +0000 UTC m=+6739.498084532" lastFinishedPulling="2026-02-01 08:39:59.700704388 +0000 UTC m=+6750.186606791" observedRunningTime="2026-02-01 08:40:00.648987922 +0000 UTC m=+6751.134890325" watchObservedRunningTime="2026-02-01 08:40:00.666087282 +0000 UTC m=+6751.151989685" Feb 01 08:40:05 crc kubenswrapper[5127]: I0201 08:40:05.236114 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:40:05 crc kubenswrapper[5127]: E0201 08:40:05.236992 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:40:17 crc kubenswrapper[5127]: I0201 08:40:17.236631 5127 
scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:40:17 crc kubenswrapper[5127]: E0201 08:40:17.237639 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:40:28 crc kubenswrapper[5127]: I0201 08:40:28.235756 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:40:28 crc kubenswrapper[5127]: E0201 08:40:28.236948 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:40:43 crc kubenswrapper[5127]: I0201 08:40:43.236252 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:40:43 crc kubenswrapper[5127]: E0201 08:40:43.236909 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:40:57 crc kubenswrapper[5127]: I0201 08:40:57.236468 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:40:57 crc kubenswrapper[5127]: E0201 08:40:57.237758 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:41:09 crc kubenswrapper[5127]: I0201 08:41:09.236375 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:41:09 crc kubenswrapper[5127]: E0201 08:41:09.251000 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:41:21 crc kubenswrapper[5127]: I0201 08:41:21.235218 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:41:21 crc kubenswrapper[5127]: E0201 08:41:21.236086 5127 
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.207166 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-w7x6q"]
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.208970 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.257037 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w7x6q"]
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.303343 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-60e9-account-create-update-jjsf4"]
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.304548 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.307031 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.320593 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-60e9-account-create-update-jjsf4"]
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.370111 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.370204 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj44g\" (UniqueName: \"kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.370518 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.370645 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsh9w\" (UniqueName: \"kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.472550 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.472660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj44g\" (UniqueName: \"kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.472806 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.472851 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsh9w\" (UniqueName: \"kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.473680 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.474284 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.492110 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsh9w\" (UniqueName: \"kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w\") pod \"barbican-db-create-w7x6q\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") " pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.495700 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj44g\" (UniqueName: \"kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g\") pod \"barbican-60e9-account-create-update-jjsf4\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") " pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.965168 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:29 crc kubenswrapper[5127]: I0201 08:41:29.968303 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.499544 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-60e9-account-create-update-jjsf4"]
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.500596 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 01 08:41:30 crc kubenswrapper[5127]: W0201 08:41:30.639903 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2e5f2be_7990_4dc6_920e_3dcdf7247424.slice/crio-234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed WatchSource:0}: Error finding container 234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed: Status 404 returned error can't find the container with id 234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.641919 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-w7x6q"]
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.982063 5127 generic.go:334] "Generic (PLEG): container finished" podID="f2e5f2be-7990-4dc6-920e-3dcdf7247424" containerID="88a134c2b0427d1c9fae08dff4980472d8a5a5771e178bb51d03b8ca606c2efd" exitCode=0
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.982545 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7x6q" event={"ID":"f2e5f2be-7990-4dc6-920e-3dcdf7247424","Type":"ContainerDied","Data":"88a134c2b0427d1c9fae08dff4980472d8a5a5771e178bb51d03b8ca606c2efd"}
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.982789 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7x6q" event={"ID":"f2e5f2be-7990-4dc6-920e-3dcdf7247424","Type":"ContainerStarted","Data":"234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed"}
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.984430 5127 generic.go:334] "Generic (PLEG): container finished" podID="9289cd0d-84e3-4e64-b58b-abb7cd491d78" containerID="d4a2e6e80a65fef990ee04fb2cccb3997b4713bf2233a615d3dfdc78de7479c9" exitCode=0
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.984503 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-60e9-account-create-update-jjsf4" event={"ID":"9289cd0d-84e3-4e64-b58b-abb7cd491d78","Type":"ContainerDied","Data":"d4a2e6e80a65fef990ee04fb2cccb3997b4713bf2233a615d3dfdc78de7479c9"}
Feb 01 08:41:30 crc kubenswrapper[5127]: I0201 08:41:30.984595 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-60e9-account-create-update-jjsf4" event={"ID":"9289cd0d-84e3-4e64-b58b-abb7cd491d78","Type":"ContainerStarted","Data":"a6bfa1919b34adcaf9acf428110548764ccd43226480333cfa8e5ac5eeb390e8"}
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.450399 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.456683 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.512150 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts\") pod \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") "
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.512213 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj44g\" (UniqueName: \"kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g\") pod \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") "
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.512279 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts\") pod \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\" (UID: \"9289cd0d-84e3-4e64-b58b-abb7cd491d78\") "
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.513105 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9289cd0d-84e3-4e64-b58b-abb7cd491d78" (UID: "9289cd0d-84e3-4e64-b58b-abb7cd491d78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.513176 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsh9w\" (UniqueName: \"kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w\") pod \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\" (UID: \"f2e5f2be-7990-4dc6-920e-3dcdf7247424\") "
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.513139 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2e5f2be-7990-4dc6-920e-3dcdf7247424" (UID: "f2e5f2be-7990-4dc6-920e-3dcdf7247424"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.514290 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2e5f2be-7990-4dc6-920e-3dcdf7247424-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.514328 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9289cd0d-84e3-4e64-b58b-abb7cd491d78-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.519811 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g" (OuterVolumeSpecName: "kube-api-access-fj44g") pod "9289cd0d-84e3-4e64-b58b-abb7cd491d78" (UID: "9289cd0d-84e3-4e64-b58b-abb7cd491d78"). InnerVolumeSpecName "kube-api-access-fj44g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.520608 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w" (OuterVolumeSpecName: "kube-api-access-gsh9w") pod "f2e5f2be-7990-4dc6-920e-3dcdf7247424" (UID: "f2e5f2be-7990-4dc6-920e-3dcdf7247424"). InnerVolumeSpecName "kube-api-access-gsh9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.615328 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsh9w\" (UniqueName: \"kubernetes.io/projected/f2e5f2be-7990-4dc6-920e-3dcdf7247424-kube-api-access-gsh9w\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:32 crc kubenswrapper[5127]: I0201 08:41:32.615365 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj44g\" (UniqueName: \"kubernetes.io/projected/9289cd0d-84e3-4e64-b58b-abb7cd491d78-kube-api-access-fj44g\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.001984 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-60e9-account-create-update-jjsf4"
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.002182 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-60e9-account-create-update-jjsf4" event={"ID":"9289cd0d-84e3-4e64-b58b-abb7cd491d78","Type":"ContainerDied","Data":"a6bfa1919b34adcaf9acf428110548764ccd43226480333cfa8e5ac5eeb390e8"}
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.002492 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6bfa1919b34adcaf9acf428110548764ccd43226480333cfa8e5ac5eeb390e8"
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.003991 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-w7x6q" event={"ID":"f2e5f2be-7990-4dc6-920e-3dcdf7247424","Type":"ContainerDied","Data":"234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed"}
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.004028 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="234ac1aa2dafaab9e3bc82f17b300c55a03a2f656834b1220669d236e9fd09ed"
Feb 01 08:41:33 crc kubenswrapper[5127]: I0201 08:41:33.004087 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-w7x6q"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.630253 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xvm96"]
Feb 01 08:41:34 crc kubenswrapper[5127]: E0201 08:41:34.630987 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9289cd0d-84e3-4e64-b58b-abb7cd491d78" containerName="mariadb-account-create-update"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.631022 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9289cd0d-84e3-4e64-b58b-abb7cd491d78" containerName="mariadb-account-create-update"
Feb 01 08:41:34 crc kubenswrapper[5127]: E0201 08:41:34.631079 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2e5f2be-7990-4dc6-920e-3dcdf7247424" containerName="mariadb-database-create"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.631097 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2e5f2be-7990-4dc6-920e-3dcdf7247424" containerName="mariadb-database-create"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.631477 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9289cd0d-84e3-4e64-b58b-abb7cd491d78" containerName="mariadb-account-create-update"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.631525 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2e5f2be-7990-4dc6-920e-3dcdf7247424" containerName="mariadb-database-create"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.632753 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.636966 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-djm9z"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.637815 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.641728 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xvm96"]
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.755269 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.755314 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b84n\" (UniqueName: \"kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.755357 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.857074 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96"
(UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.857116 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b84n\" (UniqueName: \"kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.857165 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.863685 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.876393 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.881158 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b84n\" (UniqueName: \"kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n\") pod \"barbican-db-sync-xvm96\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") " pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:34 crc kubenswrapper[5127]: I0201 08:41:34.956076 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xvm96" Feb 01 08:41:35 crc kubenswrapper[5127]: I0201 08:41:35.235608 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:41:35 crc kubenswrapper[5127]: E0201 08:41:35.236040 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:41:35 crc kubenswrapper[5127]: I0201 08:41:35.443526 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xvm96"] Feb 01 08:41:36 crc kubenswrapper[5127]: I0201 08:41:36.033015 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xvm96" event={"ID":"8d99c6eb-df9a-4205-9674-695a49e6c720","Type":"ContainerStarted","Data":"77f709993b08dea0e6a80084fa223963cef736d82709e5a486a07318758fda5a"} Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.566095 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"] Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.570005 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.622790 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.622889 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.624716 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgm9\" (UniqueName: \"kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.627217 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"] Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.727937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clgm9\" (UniqueName: \"kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.728062 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.728120 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.728905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.729031 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.757411 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgm9\" (UniqueName: \"kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9\") pod \"redhat-marketplace-jwbks\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:40 crc kubenswrapper[5127]: I0201 08:41:40.926817 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 08:41:41 crc kubenswrapper[5127]: I0201 08:41:41.483809 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"]
Feb 01 08:41:41 crc kubenswrapper[5127]: W0201 08:41:41.495490 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ea4aa4_b3ed_418d_a0cf_a594c6534011.slice/crio-adadf42f75a084808e1dab285f8fcc98eb5e5e3749a19a70095a86a59fc055f7 WatchSource:0}: Error finding container adadf42f75a084808e1dab285f8fcc98eb5e5e3749a19a70095a86a59fc055f7: Status 404 returned error can't find the container with id adadf42f75a084808e1dab285f8fcc98eb5e5e3749a19a70095a86a59fc055f7
Feb 01 08:41:42 crc kubenswrapper[5127]: I0201 08:41:42.135793 5127 generic.go:334] "Generic (PLEG): container finished" podID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerID="279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db" exitCode=0
Feb 01 08:41:42 crc kubenswrapper[5127]: I0201 08:41:42.136091 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerDied","Data":"279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db"}
Feb 01 08:41:42 crc kubenswrapper[5127]: I0201 08:41:42.136126 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerStarted","Data":"adadf42f75a084808e1dab285f8fcc98eb5e5e3749a19a70095a86a59fc055f7"}
Feb 01 08:41:42 crc kubenswrapper[5127]: I0201 08:41:42.138475 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xvm96" event={"ID":"8d99c6eb-df9a-4205-9674-695a49e6c720","Type":"ContainerStarted","Data":"4f9dfa6b6acfa850ec7a5506c38ba79e71f448b610e2911ca672c7314de9c33b"}
Feb 01 08:41:42 crc kubenswrapper[5127]: I0201 08:41:42.216924 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xvm96" podStartSLOduration=2.744255816 podStartE2EDuration="8.216893604s" podCreationTimestamp="2026-02-01 08:41:34 +0000 UTC" firstStartedPulling="2026-02-01 08:41:35.467786343 +0000 UTC m=+6845.953688696" lastFinishedPulling="2026-02-01 08:41:40.940424091 +0000 UTC m=+6851.426326484" observedRunningTime="2026-02-01 08:41:42.207481571 +0000 UTC m=+6852.693383924" watchObservedRunningTime="2026-02-01 08:41:42.216893604 +0000 UTC m=+6852.702795997"
Feb 01 08:41:44 crc kubenswrapper[5127]: I0201 08:41:44.168117 5127 generic.go:334] "Generic (PLEG): container finished" podID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerID="4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a" exitCode=0
Feb 01 08:41:44 crc kubenswrapper[5127]: I0201 08:41:44.168716 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerDied","Data":"4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a"}
Feb 01 08:41:44 crc kubenswrapper[5127]: I0201 08:41:44.172611 5127 generic.go:334] "Generic (PLEG): container finished" podID="8d99c6eb-df9a-4205-9674-695a49e6c720" containerID="4f9dfa6b6acfa850ec7a5506c38ba79e71f448b610e2911ca672c7314de9c33b" exitCode=0
Feb 01 08:41:44 crc kubenswrapper[5127]: I0201 08:41:44.172631 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xvm96" event={"ID":"8d99c6eb-df9a-4205-9674-695a49e6c720","Type":"ContainerDied","Data":"4f9dfa6b6acfa850ec7a5506c38ba79e71f448b610e2911ca672c7314de9c33b"}
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.183540 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerStarted","Data":"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c"}
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.203168 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jwbks" podStartSLOduration=2.772737908 podStartE2EDuration="5.203147527s" podCreationTimestamp="2026-02-01 08:41:40 +0000 UTC" firstStartedPulling="2026-02-01 08:41:42.138623398 +0000 UTC m=+6852.624525761" lastFinishedPulling="2026-02-01 08:41:44.569033007 +0000 UTC m=+6855.054935380" observedRunningTime="2026-02-01 08:41:45.200444024 +0000 UTC m=+6855.686346397" watchObservedRunningTime="2026-02-01 08:41:45.203147527 +0000 UTC m=+6855.689049890"
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.519834 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.623096 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") pod \"8d99c6eb-df9a-4205-9674-695a49e6c720\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") "
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.623359 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b84n\" (UniqueName: \"kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n\") pod \"8d99c6eb-df9a-4205-9674-695a49e6c720\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") "
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.623486 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle\") pod \"8d99c6eb-df9a-4205-9674-695a49e6c720\" (UID: \"8d99c6eb-df9a-4205-9674-695a49e6c720\") "
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.629934 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n" (OuterVolumeSpecName: "kube-api-access-2b84n") pod "8d99c6eb-df9a-4205-9674-695a49e6c720" (UID: "8d99c6eb-df9a-4205-9674-695a49e6c720"). InnerVolumeSpecName "kube-api-access-2b84n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.630338 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8d99c6eb-df9a-4205-9674-695a49e6c720" (UID: "8d99c6eb-df9a-4205-9674-695a49e6c720"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.654798 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d99c6eb-df9a-4205-9674-695a49e6c720" (UID: "8d99c6eb-df9a-4205-9674-695a49e6c720"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.725066 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.725137 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b84n\" (UniqueName: \"kubernetes.io/projected/8d99c6eb-df9a-4205-9674-695a49e6c720-kube-api-access-2b84n\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:45 crc kubenswrapper[5127]: I0201 08:41:45.725150 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d99c6eb-df9a-4205-9674-695a49e6c720-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.194075 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xvm96"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.194086 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xvm96" event={"ID":"8d99c6eb-df9a-4205-9674-695a49e6c720","Type":"ContainerDied","Data":"77f709993b08dea0e6a80084fa223963cef736d82709e5a486a07318758fda5a"}
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.194608 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77f709993b08dea0e6a80084fa223963cef736d82709e5a486a07318758fda5a"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.486937 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6fc8f58b97-kw7wk"]
Feb 01 08:41:46 crc kubenswrapper[5127]: E0201 08:41:46.487355 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d99c6eb-df9a-4205-9674-695a49e6c720" containerName="barbican-db-sync"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.487400 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d99c6eb-df9a-4205-9674-695a49e6c720" containerName="barbican-db-sync"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.487648 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d99c6eb-df9a-4205-9674-695a49e6c720" containerName="barbican-db-sync"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.488565 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.494537 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.494755 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.495119 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-djm9z"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.516409 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fc8f58b97-kw7wk"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.539887 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5b2030b-064a-4a95-aa42-7acffae51598-logs\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.540012 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-combined-ca-bundle\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.540060 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.540183 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxd2\" (UniqueName: \"kubernetes.io/projected/c5b2030b-064a-4a95-aa42-7acffae51598-kube-api-access-szxd2\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.540248 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data-custom\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.587273 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.589101 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.615537 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643022 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643072 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643115 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb9w7\" (UniqueName: \"kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643147 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643194 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szxd2\" (UniqueName: \"kubernetes.io/projected/c5b2030b-064a-4a95-aa42-7acffae51598-kube-api-access-szxd2\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643235 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643255 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data-custom\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643291 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5b2030b-064a-4a95-aa42-7acffae51598-logs\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643308 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.643342 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-combined-ca-bundle\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.644327 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5b2030b-064a-4a95-aa42-7acffae51598-logs\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.651224 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-combined-ca-bundle\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.654474 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data-custom\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.656058 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b2030b-064a-4a95-aa42-7acffae51598-config-data\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.671776 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-df958c9bb-zn2lr"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.673065 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.675439 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.682061 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szxd2\" (UniqueName: \"kubernetes.io/projected/c5b2030b-064a-4a95-aa42-7acffae51598-kube-api-access-szxd2\") pod \"barbican-worker-6fc8f58b97-kw7wk\" (UID: \"c5b2030b-064a-4a95-aa42-7acffae51598\") " pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.694886 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-df958c9bb-zn2lr"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.745567 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.745813 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.745873 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-logs\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.745919 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb9w7\" (UniqueName: \"kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.745965 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.746026 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-combined-ca-bundle\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.746054 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.746117 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs7v8\" (UniqueName: \"kubernetes.io/projected/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-kube-api-access-hs7v8\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.746157 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data-custom\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.746188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.747530 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.748365 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.748499 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.748668 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.758068 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58b7b9b55b-6s7jl"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.760334 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.764216 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.780630 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58b7b9b55b-6s7jl"]
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.783928 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb9w7\" (UniqueName: \"kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7\") pod \"dnsmasq-dns-b677fd8d7-bbt5x\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") " pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.822996 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6fc8f58b97-kw7wk"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.848794 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-combined-ca-bundle\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849237 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sncc\" (UniqueName: \"kubernetes.io/projected/61db7abe-1b12-4dae-884e-86399d0c5f63-kube-api-access-6sncc\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849403 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849538 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849676 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data-custom\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849801 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs7v8\" (UniqueName: \"kubernetes.io/projected/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-kube-api-access-hs7v8\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.849937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data-custom\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.850116 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db7abe-1b12-4dae-884e-86399d0c5f63-logs\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.850262 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-combined-ca-bundle\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.850397 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-logs\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.851641 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-logs\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.857195 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-combined-ca-bundle\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.863291 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.863956 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-config-data-custom\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.870461 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs7v8\" (UniqueName: \"kubernetes.io/projected/77fe9270-c3c0-4eb4-b8fd-d88d1ab06756-kube-api-access-hs7v8\") pod \"barbican-keystone-listener-df958c9bb-zn2lr\" (UID: \"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756\") " pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.917052 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.952943 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-combined-ca-bundle\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.953440 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sncc\" (UniqueName: \"kubernetes.io/projected/61db7abe-1b12-4dae-884e-86399d0c5f63-kube-api-access-6sncc\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.955310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.955439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data-custom\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.957007 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db7abe-1b12-4dae-884e-86399d0c5f63-logs\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.957696 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61db7abe-1b12-4dae-884e-86399d0c5f63-logs\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.958302 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-combined-ca-bundle\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.959537 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.960960 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/61db7abe-1b12-4dae-884e-86399d0c5f63-config-data-custom\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:46 crc kubenswrapper[5127]: I0201 08:41:46.973373 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sncc\" (UniqueName: \"kubernetes.io/projected/61db7abe-1b12-4dae-884e-86399d0c5f63-kube-api-access-6sncc\") pod \"barbican-api-58b7b9b55b-6s7jl\" (UID: \"61db7abe-1b12-4dae-884e-86399d0c5f63\") " pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.047046 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr"
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.084971 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58b7b9b55b-6s7jl"
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.308619 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6fc8f58b97-kw7wk"]
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.556675 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-df958c9bb-zn2lr"]
Feb 01 08:41:47 crc kubenswrapper[5127]: W0201 08:41:47.570991 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77fe9270_c3c0_4eb4_b8fd_d88d1ab06756.slice/crio-cf0280ff36f1c9fe7c0dc673481cc563cf47bbece20549db055da5e154a0da91 WatchSource:0}: Error finding container cf0280ff36f1c9fe7c0dc673481cc563cf47bbece20549db055da5e154a0da91: Status 404 returned error can't find the container with id cf0280ff36f1c9fe7c0dc673481cc563cf47bbece20549db055da5e154a0da91
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.608853 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"]
Feb 01 08:41:47 crc kubenswrapper[5127]: I0201 08:41:47.855637 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58b7b9b55b-6s7jl"]
Feb 01 08:41:47 crc kubenswrapper[5127]: W0201 08:41:47.867510 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61db7abe_1b12_4dae_884e_86399d0c5f63.slice/crio-d5b85762fcaf631ff5eabec2e01dc8368e2f81e622aa77be860eb3c1513a2e03 WatchSource:0}: Error finding container d5b85762fcaf631ff5eabec2e01dc8368e2f81e622aa77be860eb3c1513a2e03: Status 404 returned error can't find the container with id d5b85762fcaf631ff5eabec2e01dc8368e2f81e622aa77be860eb3c1513a2e03
Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.239646 5127 generic.go:334] "Generic (PLEG): container finished" podID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerID="2354e9a04c2a2db93de3b55b31a372c3f79b263fcce27f165097674487b4763b" exitCode=0
Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246051 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fc8f58b97-kw7wk" event={"ID":"c5b2030b-064a-4a95-aa42-7acffae51598","Type":"ContainerStarted","Data":"14bd91a4a9df71931070d7281ecee09a29fc9da4d7079eb25262bbb730051363"}
Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246099 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
event={"ID":"2b71952a-c8b5-4778-8834-ffe4aa043fe1","Type":"ContainerDied","Data":"2354e9a04c2a2db93de3b55b31a372c3f79b263fcce27f165097674487b4763b"} Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246117 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" event={"ID":"2b71952a-c8b5-4778-8834-ffe4aa043fe1","Type":"ContainerStarted","Data":"278d388b7eb0df72b8244c4a6cf23515b0ba206504a264552bcacbba92748c8b"} Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246131 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58b7b9b55b-6s7jl" event={"ID":"61db7abe-1b12-4dae-884e-86399d0c5f63","Type":"ContainerStarted","Data":"ffb574f16dfa85901f097d670f36de89454ff0777af33cfdb5e87eac18869b09"} Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58b7b9b55b-6s7jl" event={"ID":"61db7abe-1b12-4dae-884e-86399d0c5f63","Type":"ContainerStarted","Data":"d5b85762fcaf631ff5eabec2e01dc8368e2f81e622aa77be860eb3c1513a2e03"} Feb 01 08:41:48 crc kubenswrapper[5127]: I0201 08:41:48.246158 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr" event={"ID":"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756","Type":"ContainerStarted","Data":"cf0280ff36f1c9fe7c0dc673481cc563cf47bbece20549db055da5e154a0da91"} Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.236896 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:41:49 crc kubenswrapper[5127]: E0201 08:41:49.237678 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.258100 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fc8f58b97-kw7wk" event={"ID":"c5b2030b-064a-4a95-aa42-7acffae51598","Type":"ContainerStarted","Data":"ba81fe93f93dd5ef964ca67fb2477c3708e72726569f2184a88b7a50587f9af4"} Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.258174 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6fc8f58b97-kw7wk" event={"ID":"c5b2030b-064a-4a95-aa42-7acffae51598","Type":"ContainerStarted","Data":"713bbbdf47e54c3ebdbf375e730ba46e055c697ec2c9e4d7926307ad80194d84"} Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.266862 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" event={"ID":"2b71952a-c8b5-4778-8834-ffe4aa043fe1","Type":"ContainerStarted","Data":"6f6be3da71040047c9bc7a7f5ef8830e81609bb20b30f227ea15967c34d4eb5b"} Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.267034 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.271038 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58b7b9b55b-6s7jl" event={"ID":"61db7abe-1b12-4dae-884e-86399d0c5f63","Type":"ContainerStarted","Data":"958c167b1803a01801cc0e554703c63df182ac6f3f4d31b8885ff3b0c82a31aa"} Feb 01 
08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.272662 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58b7b9b55b-6s7jl" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.272799 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58b7b9b55b-6s7jl" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.278253 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr" event={"ID":"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756","Type":"ContainerStarted","Data":"03af17dea7c097962ee31e5f91c927478dee371790152b417577220d65f52708"} Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.284483 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6fc8f58b97-kw7wk" podStartSLOduration=1.941342627 podStartE2EDuration="3.284461373s" podCreationTimestamp="2026-02-01 08:41:46 +0000 UTC" firstStartedPulling="2026-02-01 08:41:47.341088667 +0000 UTC m=+6857.826991030" lastFinishedPulling="2026-02-01 08:41:48.684207413 +0000 UTC m=+6859.170109776" observedRunningTime="2026-02-01 08:41:49.277892376 +0000 UTC m=+6859.763794749" watchObservedRunningTime="2026-02-01 08:41:49.284461373 +0000 UTC m=+6859.770363746" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.321873 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" podStartSLOduration=3.321851078 podStartE2EDuration="3.321851078s" podCreationTimestamp="2026-02-01 08:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:41:49.297167145 +0000 UTC m=+6859.783069508" watchObservedRunningTime="2026-02-01 08:41:49.321851078 +0000 UTC m=+6859.807753441" Feb 01 08:41:49 crc kubenswrapper[5127]: I0201 08:41:49.337410 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58b7b9b55b-6s7jl" podStartSLOduration=3.337387497 podStartE2EDuration="3.337387497s" podCreationTimestamp="2026-02-01 08:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:41:49.32413473 +0000 UTC m=+6859.810037123" watchObservedRunningTime="2026-02-01 08:41:49.337387497 +0000 UTC m=+6859.823289860" Feb 01 08:41:50 crc kubenswrapper[5127]: I0201 08:41:50.294847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr" event={"ID":"77fe9270-c3c0-4eb4-b8fd-d88d1ab06756","Type":"ContainerStarted","Data":"3bd3f89273eb67422ab95ea01f19a8f1b6405a69ecc642a9b75956550af9a83c"} Feb 01 08:41:50 crc kubenswrapper[5127]: I0201 08:41:50.927791 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:50 crc kubenswrapper[5127]: I0201 08:41:50.928317 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:51 crc kubenswrapper[5127]: I0201 08:41:51.007487 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:51 crc kubenswrapper[5127]: I0201 08:41:51.041465 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-df958c9bb-zn2lr" 
podStartSLOduration=3.932236451 podStartE2EDuration="5.041434913s" podCreationTimestamp="2026-02-01 08:41:46 +0000 UTC" firstStartedPulling="2026-02-01 08:41:47.574702613 +0000 UTC m=+6858.060604976" lastFinishedPulling="2026-02-01 08:41:48.683901075 +0000 UTC m=+6859.169803438" observedRunningTime="2026-02-01 08:41:50.322805489 +0000 UTC m=+6860.808707852" watchObservedRunningTime="2026-02-01 08:41:51.041434913 +0000 UTC m=+6861.527337306" Feb 01 08:41:51 crc kubenswrapper[5127]: I0201 08:41:51.366972 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:51 crc kubenswrapper[5127]: I0201 08:41:51.450542 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"] Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.322687 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jwbks" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="registry-server" containerID="cri-o://fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c" gracePeriod=2 Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.822362 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.954754 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities\") pod \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.954816 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content\") pod \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.954922 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clgm9\" (UniqueName: \"kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9\") pod \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\" (UID: \"f1ea4aa4-b3ed-418d-a0cf-a594c6534011\") " Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.956518 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities" (OuterVolumeSpecName: "utilities") pod "f1ea4aa4-b3ed-418d-a0cf-a594c6534011" (UID: "f1ea4aa4-b3ed-418d-a0cf-a594c6534011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:41:53 crc kubenswrapper[5127]: I0201 08:41:53.964217 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9" (OuterVolumeSpecName: "kube-api-access-clgm9") pod "f1ea4aa4-b3ed-418d-a0cf-a594c6534011" (UID: "f1ea4aa4-b3ed-418d-a0cf-a594c6534011"). InnerVolumeSpecName "kube-api-access-clgm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.030742 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1ea4aa4-b3ed-418d-a0cf-a594c6534011" (UID: "f1ea4aa4-b3ed-418d-a0cf-a594c6534011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.057506 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.057540 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.057550 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clgm9\" (UniqueName: \"kubernetes.io/projected/f1ea4aa4-b3ed-418d-a0cf-a594c6534011-kube-api-access-clgm9\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.334256 5127 generic.go:334] "Generic (PLEG): container finished" podID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerID="fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c" exitCode=0 Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.334620 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerDied","Data":"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c"} Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.335698 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jwbks" event={"ID":"f1ea4aa4-b3ed-418d-a0cf-a594c6534011","Type":"ContainerDied","Data":"adadf42f75a084808e1dab285f8fcc98eb5e5e3749a19a70095a86a59fc055f7"} Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.334956 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jwbks" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.335885 5127 scope.go:117] "RemoveContainer" containerID="fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.386752 5127 scope.go:117] "RemoveContainer" containerID="4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.389317 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"] Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.443951 5127 scope.go:117] "RemoveContainer" containerID="279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.498718 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jwbks"] Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.565611 5127 scope.go:117] "RemoveContainer" containerID="fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c" Feb 01 08:41:54 crc kubenswrapper[5127]: E0201 08:41:54.573784 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c\": container with ID starting with fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c not found: ID does not exist" containerID="fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.573849 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c"} err="failed to get container status \"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c\": rpc error: code = NotFound desc = could not find container \"fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c\": container with ID starting with fac3be285a89882682750b59648b2abfc36cbd4fdf18f7404d839586bb0b029c not found: ID does not exist" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.573892 5127 scope.go:117] "RemoveContainer" containerID="4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a" Feb 01 08:41:54 crc kubenswrapper[5127]: E0201 08:41:54.581979 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a\": container with ID starting with 4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a not found: ID does not exist" containerID="4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.582046 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a"} err="failed to get container status \"4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a\": rpc error: code = NotFound desc = could not find container \"4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a\": container with ID starting with 4f3ade883aa49898d1200bdfcfd6966b2097ed1731e368fac18046d28744188a not found: ID does not exist" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.582075 5127 scope.go:117] "RemoveContainer" 
containerID="279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db" Feb 01 08:41:54 crc kubenswrapper[5127]: E0201 08:41:54.589713 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db\": container with ID starting with 279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db not found: ID does not exist" containerID="279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db" Feb 01 08:41:54 crc kubenswrapper[5127]: I0201 08:41:54.589767 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db"} err="failed to get container status \"279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db\": rpc error: code = NotFound desc = could not find container \"279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db\": container with ID starting with 279a47ae2e2f216f280a9439229f5a189a7a201124bec42375558ed6cc59e2db not found: ID does not exist" Feb 01 08:41:56 crc kubenswrapper[5127]: I0201 08:41:56.250495 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" path="/var/lib/kubelet/pods/f1ea4aa4-b3ed-418d-a0cf-a594c6534011/volumes" Feb 01 08:41:56 crc kubenswrapper[5127]: I0201 08:41:56.918881 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" Feb 01 08:41:56 crc kubenswrapper[5127]: I0201 08:41:56.980845 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"] Feb 01 08:41:56 crc kubenswrapper[5127]: I0201 08:41:56.981210 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="dnsmasq-dns" containerID="cri-o://b9ba7a45f43976276c4e449cd305f5884f96110b8b919c8a267daafd50c7106d" gracePeriod=10 Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.383837 5127 generic.go:334] "Generic (PLEG): container finished" podID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerID="b9ba7a45f43976276c4e449cd305f5884f96110b8b919c8a267daafd50c7106d" exitCode=0 Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.384027 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" event={"ID":"37cdb208-65b0-42ca-b90f-7b7246e58c55","Type":"ContainerDied","Data":"b9ba7a45f43976276c4e449cd305f5884f96110b8b919c8a267daafd50c7106d"} Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.553484 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.639299 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc\") pod \"37cdb208-65b0-42ca-b90f-7b7246e58c55\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.639504 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcqgm\" (UniqueName: \"kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm\") pod \"37cdb208-65b0-42ca-b90f-7b7246e58c55\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.639558 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb\") pod \"37cdb208-65b0-42ca-b90f-7b7246e58c55\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.639682 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config\") pod \"37cdb208-65b0-42ca-b90f-7b7246e58c55\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.639752 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb\") pod \"37cdb208-65b0-42ca-b90f-7b7246e58c55\" (UID: \"37cdb208-65b0-42ca-b90f-7b7246e58c55\") " Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.651791 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm" (OuterVolumeSpecName: "kube-api-access-dcqgm") pod "37cdb208-65b0-42ca-b90f-7b7246e58c55" (UID: "37cdb208-65b0-42ca-b90f-7b7246e58c55"). InnerVolumeSpecName "kube-api-access-dcqgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.687476 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37cdb208-65b0-42ca-b90f-7b7246e58c55" (UID: "37cdb208-65b0-42ca-b90f-7b7246e58c55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.688902 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37cdb208-65b0-42ca-b90f-7b7246e58c55" (UID: "37cdb208-65b0-42ca-b90f-7b7246e58c55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.723743 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config" (OuterVolumeSpecName: "config") pod "37cdb208-65b0-42ca-b90f-7b7246e58c55" (UID: "37cdb208-65b0-42ca-b90f-7b7246e58c55"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.723831 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37cdb208-65b0-42ca-b90f-7b7246e58c55" (UID: "37cdb208-65b0-42ca-b90f-7b7246e58c55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.741972 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcqgm\" (UniqueName: \"kubernetes.io/projected/37cdb208-65b0-42ca-b90f-7b7246e58c55-kube-api-access-dcqgm\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.742013 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.742025 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.742037 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:57 crc kubenswrapper[5127]: I0201 08:41:57.742046 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37cdb208-65b0-42ca-b90f-7b7246e58c55-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.415194 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" event={"ID":"37cdb208-65b0-42ca-b90f-7b7246e58c55","Type":"ContainerDied","Data":"46438fc15cb880365afb319b0c48f4293a008e535bfc03abb4c1106d042474b3"} Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.415699 5127 scope.go:117] "RemoveContainer" containerID="b9ba7a45f43976276c4e449cd305f5884f96110b8b919c8a267daafd50c7106d" Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.415889 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-896bdd75c-j4ptz" Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.452046 5127 scope.go:117] "RemoveContainer" containerID="b18b347e2c927d1858c317854321dda4ed37cbfcb7d93cb3aa516db573eb3134" Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.464564 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"] Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.471692 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-896bdd75c-j4ptz"] Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.726448 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58b7b9b55b-6s7jl" Feb 01 08:41:58 crc kubenswrapper[5127]: I0201 08:41:58.751753 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58b7b9b55b-6s7jl" Feb 01 08:42:00 crc kubenswrapper[5127]: I0201 08:42:00.249335 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" path="/var/lib/kubelet/pods/37cdb208-65b0-42ca-b90f-7b7246e58c55/volumes" Feb 01 08:42:03 crc kubenswrapper[5127]: I0201 08:42:03.235522 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:42:03 crc kubenswrapper[5127]: E0201 08:42:03.236021 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.662848 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6rrqz"] Feb 01 08:42:05 crc kubenswrapper[5127]: E0201 08:42:05.663755 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="init" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.663775 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="init" Feb 01 08:42:05 crc kubenswrapper[5127]: E0201 08:42:05.663795 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="extract-content" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.663806 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="extract-content" Feb 01 08:42:05 crc kubenswrapper[5127]: E0201 08:42:05.663823 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="extract-utilities" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.663832 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="extract-utilities" Feb 01 08:42:05 crc kubenswrapper[5127]: E0201 08:42:05.663846 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="registry-server" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.663853 5127 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="registry-server" Feb 01 08:42:05 crc kubenswrapper[5127]: E0201 08:42:05.663869 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="dnsmasq-dns" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.663880 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="dnsmasq-dns" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.664077 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ea4aa4-b3ed-418d-a0cf-a594c6534011" containerName="registry-server" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.664100 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="37cdb208-65b0-42ca-b90f-7b7246e58c55" containerName="dnsmasq-dns" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.664699 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.675432 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6rrqz"] Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.760249 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a9db-account-create-update-6zsjs"] Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.761646 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.763704 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.768035 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a9db-account-create-update-6zsjs"] Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.787483 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh785\" (UniqueName: \"kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.787626 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.889319 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh785\" (UniqueName: \"kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.889382 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kt72\" (UniqueName: \"kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " 
pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.889422 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.889462 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.890380 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.910017 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh785\" (UniqueName: \"kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785\") pod \"neutron-db-create-6rrqz\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.986905 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.991260 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kt72\" (UniqueName: \"kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.991316 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:05 crc kubenswrapper[5127]: I0201 08:42:05.992002 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:06 crc kubenswrapper[5127]: I0201 08:42:06.034305 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kt72\" (UniqueName: \"kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72\") pod \"neutron-a9db-account-create-update-6zsjs\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:06 crc kubenswrapper[5127]: I0201 08:42:06.078300 5127 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:06 crc kubenswrapper[5127]: I0201 08:42:06.371181 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a9db-account-create-update-6zsjs"] Feb 01 08:42:06 crc kubenswrapper[5127]: W0201 08:42:06.373519 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cc1de06_1d9c_4c95_979d_5c609693a411.slice/crio-40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e WatchSource:0}: Error finding container 40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e: Status 404 returned error can't find the container with id 40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e Feb 01 08:42:06 crc kubenswrapper[5127]: I0201 08:42:06.491199 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a9db-account-create-update-6zsjs" event={"ID":"8cc1de06-1d9c-4c95-979d-5c609693a411","Type":"ContainerStarted","Data":"40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e"} Feb 01 08:42:06 crc kubenswrapper[5127]: I0201 08:42:06.556119 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6rrqz"] Feb 01 08:42:06 crc kubenswrapper[5127]: W0201 08:42:06.558864 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc75d0138_b829_4d7b_add2_c80532012c2d.slice/crio-d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122 WatchSource:0}: Error finding container d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122: Status 404 returned error can't find the container with id d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122 Feb 01 08:42:07 crc kubenswrapper[5127]: I0201 08:42:07.504170 5127 generic.go:334] "Generic (PLEG): container finished" podID="8cc1de06-1d9c-4c95-979d-5c609693a411" containerID="362d988508c520e49c106c4f15ee9025a16776bd1b9ba5bc6ea277195249315e" exitCode=0 Feb 01 08:42:07 crc kubenswrapper[5127]: I0201 08:42:07.504304 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a9db-account-create-update-6zsjs" event={"ID":"8cc1de06-1d9c-4c95-979d-5c609693a411","Type":"ContainerDied","Data":"362d988508c520e49c106c4f15ee9025a16776bd1b9ba5bc6ea277195249315e"} Feb 01 08:42:07 crc kubenswrapper[5127]: I0201 08:42:07.509025 5127 generic.go:334] "Generic (PLEG): container finished" podID="c75d0138-b829-4d7b-add2-c80532012c2d" containerID="82aec21c60360ef43c9ee5de5258cab62b87f4e99a37a14fb0cbc052b4bc4ef2" exitCode=0 Feb 01 08:42:07 crc kubenswrapper[5127]: I0201 08:42:07.509085 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rrqz" event={"ID":"c75d0138-b829-4d7b-add2-c80532012c2d","Type":"ContainerDied","Data":"82aec21c60360ef43c9ee5de5258cab62b87f4e99a37a14fb0cbc052b4bc4ef2"} Feb 01 08:42:07 crc kubenswrapper[5127]: I0201 08:42:07.509130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rrqz" event={"ID":"c75d0138-b829-4d7b-add2-c80532012c2d","Type":"ContainerStarted","Data":"d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122"} Feb 01 08:42:08 crc kubenswrapper[5127]: I0201 08:42:08.949977 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:08 crc kubenswrapper[5127]: I0201 08:42:08.954975 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.071077 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts\") pod \"8cc1de06-1d9c-4c95-979d-5c609693a411\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.071415 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kt72\" (UniqueName: \"kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72\") pod \"8cc1de06-1d9c-4c95-979d-5c609693a411\" (UID: \"8cc1de06-1d9c-4c95-979d-5c609693a411\") " Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.071551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts\") pod \"c75d0138-b829-4d7b-add2-c80532012c2d\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.071664 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh785\" (UniqueName: \"kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785\") pod \"c75d0138-b829-4d7b-add2-c80532012c2d\" (UID: \"c75d0138-b829-4d7b-add2-c80532012c2d\") " Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.071836 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8cc1de06-1d9c-4c95-979d-5c609693a411" (UID: "8cc1de06-1d9c-4c95-979d-5c609693a411"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.072437 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8cc1de06-1d9c-4c95-979d-5c609693a411-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.072484 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c75d0138-b829-4d7b-add2-c80532012c2d" (UID: "c75d0138-b829-4d7b-add2-c80532012c2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.077643 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785" (OuterVolumeSpecName: "kube-api-access-wh785") pod "c75d0138-b829-4d7b-add2-c80532012c2d" (UID: "c75d0138-b829-4d7b-add2-c80532012c2d"). InnerVolumeSpecName "kube-api-access-wh785". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.077925 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72" (OuterVolumeSpecName: "kube-api-access-9kt72") pod "8cc1de06-1d9c-4c95-979d-5c609693a411" (UID: "8cc1de06-1d9c-4c95-979d-5c609693a411"). InnerVolumeSpecName "kube-api-access-9kt72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.174861 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c75d0138-b829-4d7b-add2-c80532012c2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.174917 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh785\" (UniqueName: \"kubernetes.io/projected/c75d0138-b829-4d7b-add2-c80532012c2d-kube-api-access-wh785\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.174937 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kt72\" (UniqueName: \"kubernetes.io/projected/8cc1de06-1d9c-4c95-979d-5c609693a411-kube-api-access-9kt72\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.538789 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6rrqz" event={"ID":"c75d0138-b829-4d7b-add2-c80532012c2d","Type":"ContainerDied","Data":"d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122"} Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.538836 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6rrqz" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.538853 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09535710d046e0342bea6cfc1c47ee43740ae7392567bcf66b66451095fb122" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.541870 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a9db-account-create-update-6zsjs" event={"ID":"8cc1de06-1d9c-4c95-979d-5c609693a411","Type":"ContainerDied","Data":"40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e"} Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.541920 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b36c268784d34cac212b80c84a5116e4eb74661723088d61800b2cb2e62a5e" Feb 01 08:42:09 crc kubenswrapper[5127]: I0201 08:42:09.541932 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a9db-account-create-update-6zsjs" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.967907 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-z5d2n"] Feb 01 08:42:10 crc kubenswrapper[5127]: E0201 08:42:10.968781 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75d0138-b829-4d7b-add2-c80532012c2d" containerName="mariadb-database-create" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.968804 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75d0138-b829-4d7b-add2-c80532012c2d" containerName="mariadb-database-create" Feb 01 08:42:10 crc kubenswrapper[5127]: E0201 08:42:10.968835 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc1de06-1d9c-4c95-979d-5c609693a411" containerName="mariadb-account-create-update" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.968844 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc1de06-1d9c-4c95-979d-5c609693a411" containerName="mariadb-account-create-update" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.969073 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc1de06-1d9c-4c95-979d-5c609693a411" containerName="mariadb-account-create-update" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.969117 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75d0138-b829-4d7b-add2-c80532012c2d" containerName="mariadb-database-create" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.969905 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.976019 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.976295 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.976512 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hfvkj" Feb 01 08:42:10 crc kubenswrapper[5127]: I0201 08:42:10.996095 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z5d2n"] Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.111145 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.111493 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.111770 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xq4b\" (UniqueName: \"kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc 
kubenswrapper[5127]: I0201 08:42:11.214354 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xq4b\" (UniqueName: \"kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.214456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.214552 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.220161 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.221362 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.245033 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xq4b\" (UniqueName: \"kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b\") pod \"neutron-db-sync-z5d2n\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.294521 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:11 crc kubenswrapper[5127]: I0201 08:42:11.591508 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-z5d2n"] Feb 01 08:42:12 crc kubenswrapper[5127]: I0201 08:42:12.581058 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z5d2n" event={"ID":"2f7056c1-d866-4310-abe7-fd62cab68866","Type":"ContainerStarted","Data":"d6ba0be7c47ffbdb522bfc579d6a9b4d1a5c7e96c967b09176983bd26cd56f4a"} Feb 01 08:42:12 crc kubenswrapper[5127]: I0201 08:42:12.581413 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z5d2n" event={"ID":"2f7056c1-d866-4310-abe7-fd62cab68866","Type":"ContainerStarted","Data":"8b4470e79d4df60e0eb6fbbd8eee63a8b4d250d4da7b7f12ee344ec8bf135dd4"} Feb 01 08:42:12 crc kubenswrapper[5127]: I0201 08:42:12.614674 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-z5d2n" podStartSLOduration=2.61464855 podStartE2EDuration="2.61464855s" podCreationTimestamp="2026-02-01 08:42:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:42:12.6038425 +0000 UTC m=+6883.089744903" watchObservedRunningTime="2026-02-01 08:42:12.61464855 +0000 UTC m=+6883.100550953" Feb 01 08:42:15 crc kubenswrapper[5127]: I0201 08:42:15.235705 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:42:15 crc kubenswrapper[5127]: E0201 08:42:15.236475 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:42:16 crc kubenswrapper[5127]: I0201 08:42:16.630691 5127 generic.go:334] "Generic (PLEG): container finished" podID="2f7056c1-d866-4310-abe7-fd62cab68866" containerID="d6ba0be7c47ffbdb522bfc579d6a9b4d1a5c7e96c967b09176983bd26cd56f4a" exitCode=0 Feb 01 08:42:16 crc kubenswrapper[5127]: I0201 08:42:16.630738 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z5d2n" event={"ID":"2f7056c1-d866-4310-abe7-fd62cab68866","Type":"ContainerDied","Data":"d6ba0be7c47ffbdb522bfc579d6a9b4d1a5c7e96c967b09176983bd26cd56f4a"} Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.020993 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.150768 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config\") pod \"2f7056c1-d866-4310-abe7-fd62cab68866\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.150977 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle\") pod \"2f7056c1-d866-4310-abe7-fd62cab68866\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.151930 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xq4b\" (UniqueName: \"kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b\") pod \"2f7056c1-d866-4310-abe7-fd62cab68866\" (UID: \"2f7056c1-d866-4310-abe7-fd62cab68866\") " Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.158027 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b" (OuterVolumeSpecName: "kube-api-access-6xq4b") pod "2f7056c1-d866-4310-abe7-fd62cab68866" (UID: "2f7056c1-d866-4310-abe7-fd62cab68866"). InnerVolumeSpecName "kube-api-access-6xq4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.175047 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f7056c1-d866-4310-abe7-fd62cab68866" (UID: "2f7056c1-d866-4310-abe7-fd62cab68866"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.176421 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config" (OuterVolumeSpecName: "config") pod "2f7056c1-d866-4310-abe7-fd62cab68866" (UID: "2f7056c1-d866-4310-abe7-fd62cab68866"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.253859 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.253891 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f7056c1-d866-4310-abe7-fd62cab68866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.253903 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xq4b\" (UniqueName: \"kubernetes.io/projected/2f7056c1-d866-4310-abe7-fd62cab68866-kube-api-access-6xq4b\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.674425 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-z5d2n" event={"ID":"2f7056c1-d866-4310-abe7-fd62cab68866","Type":"ContainerDied","Data":"8b4470e79d4df60e0eb6fbbd8eee63a8b4d250d4da7b7f12ee344ec8bf135dd4"} Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.674477 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b4470e79d4df60e0eb6fbbd8eee63a8b4d250d4da7b7f12ee344ec8bf135dd4" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.674573 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-z5d2n" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.946525 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"] Feb 01 08:42:18 crc kubenswrapper[5127]: E0201 08:42:18.947533 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f7056c1-d866-4310-abe7-fd62cab68866" containerName="neutron-db-sync" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.947554 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7056c1-d866-4310-abe7-fd62cab68866" containerName="neutron-db-sync" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.949742 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f7056c1-d866-4310-abe7-fd62cab68866" containerName="neutron-db-sync" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.951368 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:18 crc kubenswrapper[5127]: I0201 08:42:18.974546 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"] Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.024070 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7467ffb6b7-d4mz8"] Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.025982 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7467ffb6b7-d4mz8" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.029775 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hfvkj" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.029988 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.030369 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.030537 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7467ffb6b7-d4mz8"] Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.069825 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.069880 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.069918 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.069997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.070045 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq44g\" (UniqueName: \"kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.172061 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.172132 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd87c\" (UniqueName: \"kubernetes.io/projected/eb4805fb-4027-4dd8-980d-cc5004dac4f3-kube-api-access-jd87c\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8" Feb 01 08:42:19 
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.172231 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-httpd-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.172254 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-combined-ca-bundle\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.172396 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.173042 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq44g\" (UniqueName: \"kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.173168 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.173199 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.173303 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.173511 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.174237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.174294 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.174923 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.192444 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq44g\" (UniqueName: \"kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g\") pod \"dnsmasq-dns-5746cb9b6f-hrzzc\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.275494 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.275546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd87c\" (UniqueName: \"kubernetes.io/projected/eb4805fb-4027-4dd8-980d-cc5004dac4f3-kube-api-access-jd87c\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.275629 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-httpd-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.275652 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-combined-ca-bundle\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.282068 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-httpd-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.282712 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-config\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.290531 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.293608 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4805fb-4027-4dd8-980d-cc5004dac4f3-combined-ca-bundle\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.302755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd87c\" (UniqueName: \"kubernetes.io/projected/eb4805fb-4027-4dd8-980d-cc5004dac4f3-kube-api-access-jd87c\") pod \"neutron-7467ffb6b7-d4mz8\" (UID: \"eb4805fb-4027-4dd8-980d-cc5004dac4f3\") " pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.371699 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.944008 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7467ffb6b7-d4mz8"]
Feb 01 08:42:19 crc kubenswrapper[5127]: W0201 08:42:19.977577 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb41aa7ab_59fa_4b38_9927_a7c4d936cfa7.slice/crio-ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db WatchSource:0}: Error finding container ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db: Status 404 returned error can't find the container with id ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db
Feb 01 08:42:19 crc kubenswrapper[5127]: I0201 08:42:19.978912 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"]
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.710410 5127 generic.go:334] "Generic (PLEG): container finished" podID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerID="a108f8e2ab869da7ca39c1f51e9bacdb72bb32b915dced6111b292aad07ed53c" exitCode=0
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.710481 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" event={"ID":"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7","Type":"ContainerDied","Data":"a108f8e2ab869da7ca39c1f51e9bacdb72bb32b915dced6111b292aad07ed53c"}
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.711408 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" event={"ID":"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7","Type":"ContainerStarted","Data":"ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db"}
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.716896 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7467ffb6b7-d4mz8" event={"ID":"eb4805fb-4027-4dd8-980d-cc5004dac4f3","Type":"ContainerStarted","Data":"5f45d5da8697b8b416df9887989070917945f1e1fa4681d262394f65709f26dd"}
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.716941 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7467ffb6b7-d4mz8" event={"ID":"eb4805fb-4027-4dd8-980d-cc5004dac4f3","Type":"ContainerStarted","Data":"27477f5b3b5d12db8fcb70d0e47238571a43eb4a663451f474cdec6c3699eac2"}
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.716951 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7467ffb6b7-d4mz8" event={"ID":"eb4805fb-4027-4dd8-980d-cc5004dac4f3","Type":"ContainerStarted","Data":"9a6ec487eeddb00725266bbda65ebf3f31055d116e31d46a7cb12cdc8e6b9297"}
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.717067 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7467ffb6b7-d4mz8"
Feb 01 08:42:20 crc kubenswrapper[5127]: I0201 08:42:20.786964 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7467ffb6b7-d4mz8" podStartSLOduration=2.7869397510000002 podStartE2EDuration="2.786939751s" podCreationTimestamp="2026-02-01 08:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:42:20.777945619 +0000 UTC m=+6891.263848002" watchObservedRunningTime="2026-02-01 08:42:20.786939751 +0000 UTC m=+6891.272842124"
Feb 01 08:42:21 crc kubenswrapper[5127]: I0201 08:42:21.730424 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" event={"ID":"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7","Type":"ContainerStarted","Data":"f547792f9611b04d30f30dd56e3f2c7128e1738f7ea4294ca325640d5ced28e0"}
Feb 01 08:42:21 crc kubenswrapper[5127]: I0201 08:42:21.731422 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:21 crc kubenswrapper[5127]: I0201 08:42:21.752763 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" podStartSLOduration=3.752744036 podStartE2EDuration="3.752744036s" podCreationTimestamp="2026-02-01 08:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:42:21.752480819 +0000 UTC m=+6892.238383182" watchObservedRunningTime="2026-02-01 08:42:21.752744036 +0000 UTC m=+6892.238646399"
Feb 01 08:42:28 crc kubenswrapper[5127]: I0201 08:42:28.236254 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d"
Feb 01 08:42:28 crc kubenswrapper[5127]: E0201 08:42:28.237464 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 08:42:29 crc kubenswrapper[5127]: I0201 08:42:29.292777 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc"
Feb 01 08:42:29 crc kubenswrapper[5127]: I0201 08:42:29.558249 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"]
Feb 01 08:42:29 crc kubenswrapper[5127]: I0201 08:42:29.558945 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="dnsmasq-dns" containerID="cri-o://6f6be3da71040047c9bc7a7f5ef8830e81609bb20b30f227ea15967c34d4eb5b" gracePeriod=10
Feb 01 08:42:29 crc kubenswrapper[5127]: I0201 08:42:29.818467 5127 generic.go:334] "Generic (PLEG): container finished" podID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerID="6f6be3da71040047c9bc7a7f5ef8830e81609bb20b30f227ea15967c34d4eb5b" exitCode=0
Feb 01 08:42:29 crc kubenswrapper[5127]: I0201 08:42:29.818510 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" event={"ID":"2b71952a-c8b5-4778-8834-ffe4aa043fe1","Type":"ContainerDied","Data":"6f6be3da71040047c9bc7a7f5ef8830e81609bb20b30f227ea15967c34d4eb5b"}
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.074721 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.102829 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb\") pod \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") "
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.102971 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc\") pod \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") "
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.103027 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb\") pod \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") "
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.103078 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config\") pod \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") "
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.103133 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb9w7\" (UniqueName: \"kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7\") pod \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\" (UID: \"2b71952a-c8b5-4778-8834-ffe4aa043fe1\") "
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.113849 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7" (OuterVolumeSpecName: "kube-api-access-mb9w7") pod "2b71952a-c8b5-4778-8834-ffe4aa043fe1" (UID: "2b71952a-c8b5-4778-8834-ffe4aa043fe1"). InnerVolumeSpecName "kube-api-access-mb9w7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.149477 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b71952a-c8b5-4778-8834-ffe4aa043fe1" (UID: "2b71952a-c8b5-4778-8834-ffe4aa043fe1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.157671 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config" (OuterVolumeSpecName: "config") pod "2b71952a-c8b5-4778-8834-ffe4aa043fe1" (UID: "2b71952a-c8b5-4778-8834-ffe4aa043fe1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.159791 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b71952a-c8b5-4778-8834-ffe4aa043fe1" (UID: "2b71952a-c8b5-4778-8834-ffe4aa043fe1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.186043 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b71952a-c8b5-4778-8834-ffe4aa043fe1" (UID: "2b71952a-c8b5-4778-8834-ffe4aa043fe1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.207362 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.207417 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.207436 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-config\") on node \"crc\" DevicePath \"\""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.207451 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb9w7\" (UniqueName: \"kubernetes.io/projected/2b71952a-c8b5-4778-8834-ffe4aa043fe1-kube-api-access-mb9w7\") on node \"crc\" DevicePath \"\""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.207463 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b71952a-c8b5-4778-8834-ffe4aa043fe1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.826632 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" event={"ID":"2b71952a-c8b5-4778-8834-ffe4aa043fe1","Type":"ContainerDied","Data":"278d388b7eb0df72b8244c4a6cf23515b0ba206504a264552bcacbba92748c8b"}
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.826688 5127 scope.go:117] "RemoveContainer" containerID="6f6be3da71040047c9bc7a7f5ef8830e81609bb20b30f227ea15967c34d4eb5b"
Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.826842 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x"
Need to start a new one" pod="openstack/dnsmasq-dns-b677fd8d7-bbt5x" Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.850065 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"] Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.852010 5127 scope.go:117] "RemoveContainer" containerID="2354e9a04c2a2db93de3b55b31a372c3f79b263fcce27f165097674487b4763b" Feb 01 08:42:30 crc kubenswrapper[5127]: I0201 08:42:30.858754 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b677fd8d7-bbt5x"] Feb 01 08:42:32 crc kubenswrapper[5127]: I0201 08:42:32.254534 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" path="/var/lib/kubelet/pods/2b71952a-c8b5-4778-8834-ffe4aa043fe1/volumes" Feb 01 08:42:33 crc kubenswrapper[5127]: E0201 08:42:33.642445 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:42:40 crc kubenswrapper[5127]: I0201 08:42:40.246921 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:42:40 crc kubenswrapper[5127]: E0201 08:42:40.249571 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.300361 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:42 crc kubenswrapper[5127]: E0201 08:42:42.317467 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="dnsmasq-dns" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.317864 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="dnsmasq-dns" Feb 01 08:42:42 crc kubenswrapper[5127]: E0201 08:42:42.317882 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="init" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.317891 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="init" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.318222 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b71952a-c8b5-4778-8834-ffe4aa043fe1" containerName="dnsmasq-dns" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.327041 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.327171 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.380741 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.380819 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjxb\" (UniqueName: \"kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.381058 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.482548 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.482664 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.482736 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjxb\" (UniqueName: \"kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.483399 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.483550 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content\") pod \"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.510126 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjxb\" (UniqueName: \"kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb\") pod 
\"certified-operators-fzbwr\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:42 crc kubenswrapper[5127]: I0201 08:42:42.667360 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:43 crc kubenswrapper[5127]: I0201 08:42:43.209780 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:43 crc kubenswrapper[5127]: W0201 08:42:43.219883 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62590299_9f22_4a35_aca7_25deefd70b3b.slice/crio-ad5edaf292913d5f66221d600022cae59e0c399090b420736a83783d54f35964 WatchSource:0}: Error finding container ad5edaf292913d5f66221d600022cae59e0c399090b420736a83783d54f35964: Status 404 returned error can't find the container with id ad5edaf292913d5f66221d600022cae59e0c399090b420736a83783d54f35964 Feb 01 08:42:43 crc kubenswrapper[5127]: E0201 08:42:43.870468 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:42:44 crc kubenswrapper[5127]: I0201 08:42:44.021168 5127 generic.go:334] "Generic (PLEG): container finished" podID="62590299-9f22-4a35-aca7-25deefd70b3b" containerID="a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897" exitCode=0 Feb 01 08:42:44 crc kubenswrapper[5127]: I0201 08:42:44.021214 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerDied","Data":"a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897"} Feb 01 08:42:44 crc kubenswrapper[5127]: I0201 08:42:44.021241 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerStarted","Data":"ad5edaf292913d5f66221d600022cae59e0c399090b420736a83783d54f35964"} Feb 01 08:42:45 crc kubenswrapper[5127]: I0201 08:42:45.033689 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerStarted","Data":"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b"} Feb 01 08:42:46 crc kubenswrapper[5127]: I0201 08:42:46.064077 5127 generic.go:334] "Generic (PLEG): container finished" podID="62590299-9f22-4a35-aca7-25deefd70b3b" containerID="466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b" exitCode=0 Feb 01 08:42:46 crc kubenswrapper[5127]: I0201 08:42:46.064181 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerDied","Data":"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b"} Feb 01 08:42:47 crc kubenswrapper[5127]: I0201 08:42:47.079145 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerStarted","Data":"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1"} Feb 01 08:42:47 crc kubenswrapper[5127]: I0201 
08:42:47.104425 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fzbwr" podStartSLOduration=2.637837808 podStartE2EDuration="5.104395209s" podCreationTimestamp="2026-02-01 08:42:42 +0000 UTC" firstStartedPulling="2026-02-01 08:42:44.022733218 +0000 UTC m=+6914.508635581" lastFinishedPulling="2026-02-01 08:42:46.489290609 +0000 UTC m=+6916.975192982" observedRunningTime="2026-02-01 08:42:47.096984029 +0000 UTC m=+6917.582886402" watchObservedRunningTime="2026-02-01 08:42:47.104395209 +0000 UTC m=+6917.590297592" Feb 01 08:42:49 crc kubenswrapper[5127]: I0201 08:42:49.385181 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7467ffb6b7-d4mz8" Feb 01 08:42:52 crc kubenswrapper[5127]: I0201 08:42:52.667542 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:52 crc kubenswrapper[5127]: I0201 08:42:52.668419 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:52 crc kubenswrapper[5127]: I0201 08:42:52.748822 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:53 crc kubenswrapper[5127]: I0201 08:42:53.184639 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:53 crc kubenswrapper[5127]: I0201 08:42:53.236469 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:42:53 crc kubenswrapper[5127]: E0201 08:42:53.236870 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:42:53 crc kubenswrapper[5127]: I0201 08:42:53.248761 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:54 crc kubenswrapper[5127]: E0201 08:42:54.168000 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.153526 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fzbwr" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="registry-server" containerID="cri-o://57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1" gracePeriod=2 Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.684749 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.784131 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kjxb\" (UniqueName: \"kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb\") pod \"62590299-9f22-4a35-aca7-25deefd70b3b\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.784212 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities\") pod \"62590299-9f22-4a35-aca7-25deefd70b3b\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.784262 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content\") pod \"62590299-9f22-4a35-aca7-25deefd70b3b\" (UID: \"62590299-9f22-4a35-aca7-25deefd70b3b\") " Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.791457 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities" (OuterVolumeSpecName: "utilities") pod "62590299-9f22-4a35-aca7-25deefd70b3b" (UID: "62590299-9f22-4a35-aca7-25deefd70b3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.800894 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb" (OuterVolumeSpecName: "kube-api-access-6kjxb") pod "62590299-9f22-4a35-aca7-25deefd70b3b" (UID: "62590299-9f22-4a35-aca7-25deefd70b3b"). InnerVolumeSpecName "kube-api-access-6kjxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.846246 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62590299-9f22-4a35-aca7-25deefd70b3b" (UID: "62590299-9f22-4a35-aca7-25deefd70b3b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.885999 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kjxb\" (UniqueName: \"kubernetes.io/projected/62590299-9f22-4a35-aca7-25deefd70b3b-kube-api-access-6kjxb\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.886041 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:55 crc kubenswrapper[5127]: I0201 08:42:55.886055 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62590299-9f22-4a35-aca7-25deefd70b3b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.164811 5127 generic.go:334] "Generic (PLEG): container finished" podID="62590299-9f22-4a35-aca7-25deefd70b3b" containerID="57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1" exitCode=0 Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.164924 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fzbwr" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.164888 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerDied","Data":"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1"} Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.165136 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fzbwr" event={"ID":"62590299-9f22-4a35-aca7-25deefd70b3b","Type":"ContainerDied","Data":"ad5edaf292913d5f66221d600022cae59e0c399090b420736a83783d54f35964"} Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.165168 5127 scope.go:117] "RemoveContainer" containerID="57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.191101 5127 scope.go:117] "RemoveContainer" containerID="466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.208804 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.225748 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fzbwr"] Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.235441 5127 scope.go:117] "RemoveContainer" containerID="a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.251474 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" path="/var/lib/kubelet/pods/62590299-9f22-4a35-aca7-25deefd70b3b/volumes" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.255905 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9mdcv"] Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.256268 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="registry-server" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.256288 5127 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="registry-server" Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.256305 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="extract-content" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.256315 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="extract-content" Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.256338 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="extract-utilities" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.256430 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="extract-utilities" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.257459 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="62590299-9f22-4a35-aca7-25deefd70b3b" containerName="registry-server" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.258185 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.264458 5127 scope.go:117] "RemoveContainer" containerID="57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1" Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.271225 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1\": container with ID starting with 57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1 not found: ID does not exist" containerID="57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.271266 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1"} err="failed to get container status \"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1\": rpc error: code = NotFound desc = could not find container \"57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1\": container with ID starting with 57c29acb47bf2c4fd53d8064fb3a57f077b8d1d2f9a61ffc6055b7de2fb181e1 not found: ID does not exist" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.271288 5127 scope.go:117] "RemoveContainer" containerID="466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b" Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.271718 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b\": container with ID starting with 466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b not found: ID does not exist" containerID="466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.271740 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b"} err="failed to get container status \"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b\": rpc error: code = NotFound desc = could not find container 
\"466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b\": container with ID starting with 466f4bc136248c247b25f5831e284c87dae222d2245a754355c7ab22f691558b not found: ID does not exist" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.271752 5127 scope.go:117] "RemoveContainer" containerID="a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897" Feb 01 08:42:56 crc kubenswrapper[5127]: E0201 08:42:56.272093 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897\": container with ID starting with a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897 not found: ID does not exist" containerID="a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.272130 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897"} err="failed to get container status \"a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897\": rpc error: code = NotFound desc = could not find container \"a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897\": container with ID starting with a5b9e4805e61d6a21b605b738a3ba6de0847ba4f129eedce4c00347f6de47897 not found: ID does not exist" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.278801 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9mdcv"] Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.292348 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.292540 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknfh\" (UniqueName: \"kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.338379 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-21bb-account-create-update-ks2wx"] Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.339732 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.344206 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.347140 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-21bb-account-create-update-ks2wx"] Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.394877 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknfh\" (UniqueName: \"kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.394976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts\") pod \"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.395006 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqqn\" (UniqueName: \"kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn\") pod \"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.395046 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.395791 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.412684 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknfh\" (UniqueName: \"kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh\") pod \"glance-db-create-9mdcv\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.497099 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts\") pod \"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.497439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqqn\" (UniqueName: \"kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn\") pod 
\"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.498219 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts\") pod \"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.515285 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqqn\" (UniqueName: \"kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn\") pod \"glance-21bb-account-create-update-ks2wx\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.617572 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.656005 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:56 crc kubenswrapper[5127]: I0201 08:42:56.921722 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9mdcv"] Feb 01 08:42:57 crc kubenswrapper[5127]: I0201 08:42:57.175279 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mdcv" event={"ID":"0d6ff443-99a7-45a7-8fae-ed6495357ab0","Type":"ContainerStarted","Data":"375cff144ce619c5ff16b256717ea8a57770301acf6afd3c5ae93616e45b2645"} Feb 01 08:42:57 crc kubenswrapper[5127]: I0201 08:42:57.175659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mdcv" event={"ID":"0d6ff443-99a7-45a7-8fae-ed6495357ab0","Type":"ContainerStarted","Data":"92747e6726dd83f62c2f3490dd3ed8cc77894df2980139326655da7f013090ba"} Feb 01 08:42:57 crc kubenswrapper[5127]: I0201 08:42:57.196984 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-9mdcv" podStartSLOduration=1.196960244 podStartE2EDuration="1.196960244s" podCreationTimestamp="2026-02-01 08:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:42:57.194292633 +0000 UTC m=+6927.680195006" watchObservedRunningTime="2026-02-01 08:42:57.196960244 +0000 UTC m=+6927.682862607" Feb 01 08:42:57 crc kubenswrapper[5127]: I0201 08:42:57.217553 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-21bb-account-create-update-ks2wx"] Feb 01 08:42:57 crc kubenswrapper[5127]: W0201 08:42:57.219991 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13dd2e31_d7bd_4984_bc91_480330e01ed1.slice/crio-b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde WatchSource:0}: Error finding container b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde: Status 404 returned error can't find the container with id b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde Feb 01 08:42:58 crc kubenswrapper[5127]: I0201 08:42:58.190886 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="13dd2e31-d7bd-4984-bc91-480330e01ed1" containerID="5f367f456f379e5ac4964bcdb7cd6a1a64f4e09e5b731f3b2200c87b925894b5" exitCode=0 Feb 01 08:42:58 crc kubenswrapper[5127]: I0201 08:42:58.191732 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-21bb-account-create-update-ks2wx" event={"ID":"13dd2e31-d7bd-4984-bc91-480330e01ed1","Type":"ContainerDied","Data":"5f367f456f379e5ac4964bcdb7cd6a1a64f4e09e5b731f3b2200c87b925894b5"} Feb 01 08:42:58 crc kubenswrapper[5127]: I0201 08:42:58.192712 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-21bb-account-create-update-ks2wx" event={"ID":"13dd2e31-d7bd-4984-bc91-480330e01ed1","Type":"ContainerStarted","Data":"b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde"} Feb 01 08:42:58 crc kubenswrapper[5127]: I0201 08:42:58.194561 5127 generic.go:334] "Generic (PLEG): container finished" podID="0d6ff443-99a7-45a7-8fae-ed6495357ab0" containerID="375cff144ce619c5ff16b256717ea8a57770301acf6afd3c5ae93616e45b2645" exitCode=0 Feb 01 08:42:58 crc kubenswrapper[5127]: I0201 08:42:58.194612 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mdcv" event={"ID":"0d6ff443-99a7-45a7-8fae-ed6495357ab0","Type":"ContainerDied","Data":"375cff144ce619c5ff16b256717ea8a57770301acf6afd3c5ae93616e45b2645"} Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.649777 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.655084 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mdcv" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.802111 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rknfh\" (UniqueName: \"kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh\") pod \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.802298 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts\") pod \"13dd2e31-d7bd-4984-bc91-480330e01ed1\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.802418 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqqn\" (UniqueName: \"kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn\") pod \"13dd2e31-d7bd-4984-bc91-480330e01ed1\" (UID: \"13dd2e31-d7bd-4984-bc91-480330e01ed1\") " Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.802473 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts\") pod \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\" (UID: \"0d6ff443-99a7-45a7-8fae-ed6495357ab0\") " Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.802946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13dd2e31-d7bd-4984-bc91-480330e01ed1" (UID: "13dd2e31-d7bd-4984-bc91-480330e01ed1"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.803222 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d6ff443-99a7-45a7-8fae-ed6495357ab0" (UID: "0d6ff443-99a7-45a7-8fae-ed6495357ab0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.809086 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh" (OuterVolumeSpecName: "kube-api-access-rknfh") pod "0d6ff443-99a7-45a7-8fae-ed6495357ab0" (UID: "0d6ff443-99a7-45a7-8fae-ed6495357ab0"). InnerVolumeSpecName "kube-api-access-rknfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.809142 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn" (OuterVolumeSpecName: "kube-api-access-hzqqn") pod "13dd2e31-d7bd-4984-bc91-480330e01ed1" (UID: "13dd2e31-d7bd-4984-bc91-480330e01ed1"). InnerVolumeSpecName "kube-api-access-hzqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.905053 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13dd2e31-d7bd-4984-bc91-480330e01ed1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.905099 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzqqn\" (UniqueName: \"kubernetes.io/projected/13dd2e31-d7bd-4984-bc91-480330e01ed1-kube-api-access-hzqqn\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.905115 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d6ff443-99a7-45a7-8fae-ed6495357ab0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:42:59 crc kubenswrapper[5127]: I0201 08:42:59.905127 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rknfh\" (UniqueName: \"kubernetes.io/projected/0d6ff443-99a7-45a7-8fae-ed6495357ab0-kube-api-access-rknfh\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.216191 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-21bb-account-create-update-ks2wx" Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.216197 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-21bb-account-create-update-ks2wx" event={"ID":"13dd2e31-d7bd-4984-bc91-480330e01ed1","Type":"ContainerDied","Data":"b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde"} Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.216276 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b23ed1ee70c6a3eb0ace9a0aadd38b82299256a18a2a3f2b34c04d334c642cde" Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.218087 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9mdcv" event={"ID":"0d6ff443-99a7-45a7-8fae-ed6495357ab0","Type":"ContainerDied","Data":"92747e6726dd83f62c2f3490dd3ed8cc77894df2980139326655da7f013090ba"} Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.218131 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92747e6726dd83f62c2f3490dd3ed8cc77894df2980139326655da7f013090ba" Feb 01 08:43:00 crc kubenswrapper[5127]: I0201 08:43:00.218208 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9mdcv" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.534712 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zfmhw"] Feb 01 08:43:01 crc kubenswrapper[5127]: E0201 08:43:01.535623 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6ff443-99a7-45a7-8fae-ed6495357ab0" containerName="mariadb-database-create" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.535646 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6ff443-99a7-45a7-8fae-ed6495357ab0" containerName="mariadb-database-create" Feb 01 08:43:01 crc kubenswrapper[5127]: E0201 08:43:01.535689 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dd2e31-d7bd-4984-bc91-480330e01ed1" containerName="mariadb-account-create-update" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.535703 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dd2e31-d7bd-4984-bc91-480330e01ed1" containerName="mariadb-account-create-update" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.536000 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dd2e31-d7bd-4984-bc91-480330e01ed1" containerName="mariadb-account-create-update" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.536030 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6ff443-99a7-45a7-8fae-ed6495357ab0" containerName="mariadb-database-create" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.537067 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.540338 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.546177 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24zps" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.549165 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zfmhw"] Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.640354 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.640437 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.640549 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.640715 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz9rh\" (UniqueName: \"kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.742310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz9rh\" (UniqueName: \"kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.742431 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.742475 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.742570 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data\") pod 
\"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.749037 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.763930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.765211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz9rh\" (UniqueName: \"kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.767346 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data\") pod \"glance-db-sync-zfmhw\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:01 crc kubenswrapper[5127]: I0201 08:43:01.873835 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:02 crc kubenswrapper[5127]: I0201 08:43:02.473462 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zfmhw"] Feb 01 08:43:03 crc kubenswrapper[5127]: I0201 08:43:03.243476 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zfmhw" event={"ID":"e1054ddd-1c98-4a90-a760-31b8044a68a3","Type":"ContainerStarted","Data":"0924b7879ad38e09a340b18f5447b4641c31277a8b188fe7a0f7ca9d7e08e49e"} Feb 01 08:43:04 crc kubenswrapper[5127]: I0201 08:43:04.236102 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:43:04 crc kubenswrapper[5127]: E0201 08:43:04.236678 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:43:04 crc kubenswrapper[5127]: E0201 08:43:04.359916 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:43:14 crc kubenswrapper[5127]: E0201 08:43:14.613731 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find 
data in memory cache]" Feb 01 08:43:15 crc kubenswrapper[5127]: I0201 08:43:15.250019 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:43:15 crc kubenswrapper[5127]: E0201 08:43:15.250479 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:43:19 crc kubenswrapper[5127]: I0201 08:43:19.439280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zfmhw" event={"ID":"e1054ddd-1c98-4a90-a760-31b8044a68a3","Type":"ContainerStarted","Data":"59b3fa772fcda5210fc68d8201644615fdf0ff3ec0ede819d2e89e94cc4ce57d"} Feb 01 08:43:19 crc kubenswrapper[5127]: I0201 08:43:19.468643 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zfmhw" podStartSLOduration=2.3065137399999998 podStartE2EDuration="18.468613503s" podCreationTimestamp="2026-02-01 08:43:01 +0000 UTC" firstStartedPulling="2026-02-01 08:43:02.483384554 +0000 UTC m=+6932.969286927" lastFinishedPulling="2026-02-01 08:43:18.645484327 +0000 UTC m=+6949.131386690" observedRunningTime="2026-02-01 08:43:19.455328545 +0000 UTC m=+6949.941230948" watchObservedRunningTime="2026-02-01 08:43:19.468613503 +0000 UTC m=+6949.954515906" Feb 01 08:43:22 crc kubenswrapper[5127]: I0201 08:43:22.465437 5127 generic.go:334] "Generic (PLEG): container finished" podID="e1054ddd-1c98-4a90-a760-31b8044a68a3" containerID="59b3fa772fcda5210fc68d8201644615fdf0ff3ec0ede819d2e89e94cc4ce57d" exitCode=0 Feb 01 08:43:22 crc kubenswrapper[5127]: I0201 08:43:22.465527 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zfmhw" event={"ID":"e1054ddd-1c98-4a90-a760-31b8044a68a3","Type":"ContainerDied","Data":"59b3fa772fcda5210fc68d8201644615fdf0ff3ec0ede819d2e89e94cc4ce57d"} Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.056246 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.108792 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle\") pod \"e1054ddd-1c98-4a90-a760-31b8044a68a3\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.108863 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz9rh\" (UniqueName: \"kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh\") pod \"e1054ddd-1c98-4a90-a760-31b8044a68a3\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.109036 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data\") pod \"e1054ddd-1c98-4a90-a760-31b8044a68a3\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.109238 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data\") pod \"e1054ddd-1c98-4a90-a760-31b8044a68a3\" (UID: \"e1054ddd-1c98-4a90-a760-31b8044a68a3\") " Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.118913 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh" (OuterVolumeSpecName: "kube-api-access-nz9rh") pod "e1054ddd-1c98-4a90-a760-31b8044a68a3" (UID: "e1054ddd-1c98-4a90-a760-31b8044a68a3"). InnerVolumeSpecName "kube-api-access-nz9rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.119025 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e1054ddd-1c98-4a90-a760-31b8044a68a3" (UID: "e1054ddd-1c98-4a90-a760-31b8044a68a3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.133886 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1054ddd-1c98-4a90-a760-31b8044a68a3" (UID: "e1054ddd-1c98-4a90-a760-31b8044a68a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.161809 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data" (OuterVolumeSpecName: "config-data") pod "e1054ddd-1c98-4a90-a760-31b8044a68a3" (UID: "e1054ddd-1c98-4a90-a760-31b8044a68a3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.211413 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.211456 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.211467 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1054ddd-1c98-4a90-a760-31b8044a68a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.211477 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz9rh\" (UniqueName: \"kubernetes.io/projected/e1054ddd-1c98-4a90-a760-31b8044a68a3-kube-api-access-nz9rh\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.485832 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zfmhw" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.485832 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zfmhw" event={"ID":"e1054ddd-1c98-4a90-a760-31b8044a68a3","Type":"ContainerDied","Data":"0924b7879ad38e09a340b18f5447b4641c31277a8b188fe7a0f7ca9d7e08e49e"} Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.486326 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0924b7879ad38e09a340b18f5447b4641c31277a8b188fe7a0f7ca9d7e08e49e" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.834845 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:24 crc kubenswrapper[5127]: E0201 08:43:24.835723 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1054ddd-1c98-4a90-a760-31b8044a68a3" containerName="glance-db-sync" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.835740 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1054ddd-1c98-4a90-a760-31b8044a68a3" containerName="glance-db-sync" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.837069 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1054ddd-1c98-4a90-a760-31b8044a68a3" containerName="glance-db-sync" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.839560 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.857242 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.857527 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.862654 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.868607 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-24zps" Feb 01 08:43:24 crc kubenswrapper[5127]: E0201 08:43:24.877316 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b71952a_c8b5_4778_8834_ffe4aa043fe1.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.935759 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.973502 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.976461 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.987350 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.988884 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.992869 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 08:43:24 crc kubenswrapper[5127]: I0201 08:43:24.998500 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.030589 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.031409 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.031466 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.031522 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.031543 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.035636 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.035687 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7cp\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.035751 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137118 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137209 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137245 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgszm\" (UniqueName: \"kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137270 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137303 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137335 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137385 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wvd\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137425 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137471 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137550 5127 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137621 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137679 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137738 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137771 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137794 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137870 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7cp\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.137891 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.138002 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.139946 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.142070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.142551 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.142609 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.143385 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.159493 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7cp\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp\") pod \"glance-default-external-api-0\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.178798 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240142 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240201 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240237 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240320 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240347 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgszm\" (UniqueName: \"kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240393 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240427 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wvd\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240458 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240512 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc 
kubenswrapper[5127]: I0201 08:43:25.240536 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240559 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.240617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.241287 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.241826 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.243764 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.249296 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.249510 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.252064 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.258770 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.261737 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgszm\" (UniqueName: \"kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.262389 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wvd\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.262725 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.263537 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.263692 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc\") pod \"dnsmasq-dns-68586cb4b7-l6qxh\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.304798 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.348780 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.716143 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:25 crc kubenswrapper[5127]: I0201 08:43:25.844552 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.090408 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.102626 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.242297 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:43:26 crc kubenswrapper[5127]: E0201 08:43:26.242681 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.508055 5127 generic.go:334] "Generic (PLEG): container finished" podID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerID="53fd0def7484ba67a38abe6568e41841f0f81ebfb4c6f49ec5fa40a2553d0053" exitCode=0 Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.508136 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" event={"ID":"c35f18bb-68ff-4460-b5a6-848c47fc7495","Type":"ContainerDied","Data":"53fd0def7484ba67a38abe6568e41841f0f81ebfb4c6f49ec5fa40a2553d0053"} Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.508329 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" event={"ID":"c35f18bb-68ff-4460-b5a6-848c47fc7495","Type":"ContainerStarted","Data":"3ceda4b59d567870cd3f764f901dd241a5f35de012422db81150e194c1e453af"} Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.520114 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerStarted","Data":"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f"} Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.520157 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerStarted","Data":"c57d68323e9dc1167ecc0a7a65ff5f4ec520e89d3d3f5604da53ec69323994b4"} Feb 01 08:43:26 crc kubenswrapper[5127]: I0201 08:43:26.523613 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerStarted","Data":"b4df3d7c9ccd804258e078a272fb0f436003ad083b16b8ba8af2663fbe9bcde8"} Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.535450 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerStarted","Data":"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f"} Feb 
01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.536412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerStarted","Data":"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7"} Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.538928 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" event={"ID":"c35f18bb-68ff-4460-b5a6-848c47fc7495","Type":"ContainerStarted","Data":"3d76621a2da79b532e43a5d3da4c6c029e842523aed07a14d7fe4f74068959b9"} Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.539066 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.540691 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerStarted","Data":"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071"} Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.540824 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-log" containerID="cri-o://6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" gracePeriod=30 Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.540855 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-httpd" containerID="cri-o://da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" gracePeriod=30 Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.582303 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.582275847 podStartE2EDuration="3.582275847s" podCreationTimestamp="2026-02-01 08:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:27.569829753 +0000 UTC m=+6958.055732136" watchObservedRunningTime="2026-02-01 08:43:27.582275847 +0000 UTC m=+6958.068178230" Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.636799 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" podStartSLOduration=3.636781293 podStartE2EDuration="3.636781293s" podCreationTimestamp="2026-02-01 08:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:27.629261301 +0000 UTC m=+6958.115163684" watchObservedRunningTime="2026-02-01 08:43:27.636781293 +0000 UTC m=+6958.122683656" Feb 01 08:43:27 crc kubenswrapper[5127]: I0201 08:43:27.637772 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.63776581 podStartE2EDuration="3.63776581s" podCreationTimestamp="2026-02-01 08:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:27.61065478 +0000 UTC m=+6958.096557153" watchObservedRunningTime="2026-02-01 08:43:27.63776581 +0000 UTC m=+6958.123668173" Feb 01 
08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.044109 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.178487 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.203879 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204001 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204035 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204114 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204218 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204298 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb7cp\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp\") pod \"2bb83a89-9068-42e8-908f-1a3aaef236a5\" (UID: \"2bb83a89-9068-42e8-908f-1a3aaef236a5\") " Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.204648 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.205157 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs" (OuterVolumeSpecName: "logs") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.205421 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.205453 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2bb83a89-9068-42e8-908f-1a3aaef236a5-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.209861 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts" (OuterVolumeSpecName: "scripts") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.221506 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph" (OuterVolumeSpecName: "ceph") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.221671 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp" (OuterVolumeSpecName: "kube-api-access-lb7cp") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "kube-api-access-lb7cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.246400 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.257337 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data" (OuterVolumeSpecName: "config-data") pod "2bb83a89-9068-42e8-908f-1a3aaef236a5" (UID: "2bb83a89-9068-42e8-908f-1a3aaef236a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.307449 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.307490 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.307505 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb7cp\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-kube-api-access-lb7cp\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.307519 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2bb83a89-9068-42e8-908f-1a3aaef236a5-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.307534 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bb83a89-9068-42e8-908f-1a3aaef236a5-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.550720 5127 generic.go:334] "Generic (PLEG): container finished" podID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerID="da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" exitCode=0 Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.550761 5127 generic.go:334] "Generic (PLEG): container finished" podID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerID="6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" exitCode=143 Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.551474 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.551908 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerDied","Data":"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071"} Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.551934 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerDied","Data":"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f"} Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.551945 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2bb83a89-9068-42e8-908f-1a3aaef236a5","Type":"ContainerDied","Data":"c57d68323e9dc1167ecc0a7a65ff5f4ec520e89d3d3f5604da53ec69323994b4"} Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.551959 5127 scope.go:117] "RemoveContainer" containerID="da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.574424 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.582227 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.583068 5127 scope.go:117] "RemoveContainer" containerID="6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.607905 5127 scope.go:117] "RemoveContainer" containerID="da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" Feb 01 08:43:28 crc kubenswrapper[5127]: E0201 08:43:28.608433 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071\": container with ID starting with da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071 not found: ID does not exist" containerID="da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.608481 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071"} err="failed to get container status \"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071\": rpc error: code = NotFound desc = could not find container \"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071\": container with ID starting with da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071 not found: ID does not exist" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.608514 5127 scope.go:117] "RemoveContainer" containerID="6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" Feb 01 08:43:28 crc kubenswrapper[5127]: E0201 08:43:28.609060 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f\": container with ID starting with 6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f not found: ID does not exist" 
containerID="6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.609096 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f"} err="failed to get container status \"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f\": rpc error: code = NotFound desc = could not find container \"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f\": container with ID starting with 6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f not found: ID does not exist" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.609115 5127 scope.go:117] "RemoveContainer" containerID="da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.609653 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071"} err="failed to get container status \"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071\": rpc error: code = NotFound desc = could not find container \"da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071\": container with ID starting with da0cbc07c43b19de1ac7e9127045b3696f06daa938e7ebcc4c370743d4ca4071 not found: ID does not exist" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.609695 5127 scope.go:117] "RemoveContainer" containerID="6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.609849 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.610043 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f"} err="failed to get container status \"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f\": rpc error: code = NotFound desc = could not find container \"6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f\": container with ID starting with 6ae9710824fc904c7bc026d055435433083c37e80f48212e705f52059ecf700f not found: ID does not exist" Feb 01 08:43:28 crc kubenswrapper[5127]: E0201 08:43:28.610479 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-log" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.610504 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-log" Feb 01 08:43:28 crc kubenswrapper[5127]: E0201 08:43:28.610531 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-httpd" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.610540 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-httpd" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.610781 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-log" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.610808 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" containerName="glance-httpd" 
Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.615801 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.621435 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.621920 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.716681 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.716742 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.717130 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.717221 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtg87\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.717322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.717364 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.717630 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819788 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819835 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819906 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819933 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtg87\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819958 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.819977 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.820017 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.820532 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.820696 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.824902 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.825077 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.828495 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.836029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtg87\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.836163 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " pod="openstack/glance-default-external-api-0" Feb 01 08:43:28 crc kubenswrapper[5127]: I0201 08:43:28.938479 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:43:29 crc kubenswrapper[5127]: I0201 08:43:29.773300 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-log" containerID="cri-o://05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" gracePeriod=30 Feb 01 08:43:29 crc kubenswrapper[5127]: I0201 08:43:29.773369 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-httpd" containerID="cri-o://36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" gracePeriod=30 Feb 01 08:43:29 crc kubenswrapper[5127]: I0201 08:43:29.815749 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.251207 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bb83a89-9068-42e8-908f-1a3aaef236a5" path="/var/lib/kubelet/pods/2bb83a89-9068-42e8-908f-1a3aaef236a5/volumes" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.481020 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.660781 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2wvd\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.660852 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.660877 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.660973 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.661047 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.661083 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.661118 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run\") pod \"f193a586-4ae3-4682-b0d8-383612e65d04\" (UID: \"f193a586-4ae3-4682-b0d8-383612e65d04\") " Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.662187 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.665376 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs" (OuterVolumeSpecName: "logs") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.665652 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd" (OuterVolumeSpecName: "kube-api-access-v2wvd") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "kube-api-access-v2wvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.665766 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts" (OuterVolumeSpecName: "scripts") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.668548 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph" (OuterVolumeSpecName: "ceph") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.697268 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.716453 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data" (OuterVolumeSpecName: "config-data") pod "f193a586-4ae3-4682-b0d8-383612e65d04" (UID: "f193a586-4ae3-4682-b0d8-383612e65d04"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763158 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763197 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763206 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f193a586-4ae3-4682-b0d8-383612e65d04-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763215 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2wvd\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-kube-api-access-v2wvd\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763226 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763249 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f193a586-4ae3-4682-b0d8-383612e65d04-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.763256 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f193a586-4ae3-4682-b0d8-383612e65d04-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782449 5127 generic.go:334] "Generic (PLEG): container finished" podID="f193a586-4ae3-4682-b0d8-383612e65d04" containerID="36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" exitCode=0 Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782479 5127 generic.go:334] "Generic (PLEG): container finished" podID="f193a586-4ae3-4682-b0d8-383612e65d04" containerID="05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" exitCode=143 Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782523 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782531 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerDied","Data":"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f"} Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782561 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerDied","Data":"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7"} Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782573 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f193a586-4ae3-4682-b0d8-383612e65d04","Type":"ContainerDied","Data":"b4df3d7c9ccd804258e078a272fb0f436003ad083b16b8ba8af2663fbe9bcde8"} Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.782616 5127 scope.go:117] "RemoveContainer" containerID="36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.788173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerStarted","Data":"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068"} Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.788205 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerStarted","Data":"58c1716a09701414054daf5a04cdade9eb6b230f3127d5a0cf93ed3378126ac8"} Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.807918 5127 scope.go:117] "RemoveContainer" containerID="05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.852741 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.856453 5127 scope.go:117] "RemoveContainer" containerID="36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" Feb 01 08:43:30 crc kubenswrapper[5127]: E0201 08:43:30.857783 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f\": container with ID starting with 36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f not found: ID does not exist" containerID="36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.857830 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f"} err="failed to get container status \"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f\": rpc error: code = NotFound desc = could not find container \"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f\": container with ID starting with 36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f not found: ID does not exist" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.857857 5127 scope.go:117] "RemoveContainer" 
containerID="05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" Feb 01 08:43:30 crc kubenswrapper[5127]: E0201 08:43:30.858326 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7\": container with ID starting with 05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7 not found: ID does not exist" containerID="05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.858381 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7"} err="failed to get container status \"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7\": rpc error: code = NotFound desc = could not find container \"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7\": container with ID starting with 05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7 not found: ID does not exist" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.858404 5127 scope.go:117] "RemoveContainer" containerID="36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.858632 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f"} err="failed to get container status \"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f\": rpc error: code = NotFound desc = could not find container \"36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f\": container with ID starting with 36993bfce1561e3bcd4c364691b5149be6d8b94299f315c04be93309aa7f7c0f not found: ID does not exist" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.858652 5127 scope.go:117] "RemoveContainer" containerID="05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.858934 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7"} err="failed to get container status \"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7\": rpc error: code = NotFound desc = could not find container \"05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7\": container with ID starting with 05960fb2d1fb60b829a5559105459b42acf55b66f4deb9da6d185f4e2e0d54e7 not found: ID does not exist" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.867679 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.876455 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:30 crc kubenswrapper[5127]: E0201 08:43:30.877508 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-httpd" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.877615 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-httpd" Feb 01 08:43:30 crc kubenswrapper[5127]: E0201 08:43:30.877688 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-log" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.877752 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-log" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.877963 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-log" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.878035 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" containerName="glance-httpd" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.879028 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.885739 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 08:43:30 crc kubenswrapper[5127]: I0201 08:43:30.887862 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.069285 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.069753 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.069827 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.069951 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.070250 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.070402 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmfg\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " 
pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.070784 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.172721 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173521 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmfg\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173559 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173597 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.173696 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.174195 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 
08:43:31.174291 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.176535 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.176982 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.177355 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.182119 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.195226 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmfg\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg\") pod \"glance-default-internal-api-0\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.212800 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.725756 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.803280 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerStarted","Data":"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132"} Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.811719 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerStarted","Data":"7a662a992c1f6d52360885fd24c5948cfb1a244b7019b4399f470cf93d009b71"} Feb 01 08:43:31 crc kubenswrapper[5127]: I0201 08:43:31.838490 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.838461977 podStartE2EDuration="3.838461977s" podCreationTimestamp="2026-02-01 08:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:31.82514746 +0000 UTC m=+6962.311049833" watchObservedRunningTime="2026-02-01 08:43:31.838461977 +0000 UTC m=+6962.324364360" Feb 01 08:43:32 crc kubenswrapper[5127]: I0201 08:43:32.256636 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f193a586-4ae3-4682-b0d8-383612e65d04" path="/var/lib/kubelet/pods/f193a586-4ae3-4682-b0d8-383612e65d04/volumes" Feb 01 08:43:32 crc kubenswrapper[5127]: I0201 08:43:32.824665 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerStarted","Data":"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601"} Feb 01 08:43:33 crc kubenswrapper[5127]: I0201 08:43:33.836729 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerStarted","Data":"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee"} Feb 01 08:43:33 crc kubenswrapper[5127]: I0201 08:43:33.872614 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.872551973 podStartE2EDuration="3.872551973s" podCreationTimestamp="2026-02-01 08:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:33.86312403 +0000 UTC m=+6964.349026453" watchObservedRunningTime="2026-02-01 08:43:33.872551973 +0000 UTC m=+6964.358454366" Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.306777 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.409343 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"] Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.409620 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="dnsmasq-dns" 
containerID="cri-o://f547792f9611b04d30f30dd56e3f2c7128e1738f7ea4294ca325640d5ced28e0" gracePeriod=10 Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.858003 5127 generic.go:334] "Generic (PLEG): container finished" podID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerID="f547792f9611b04d30f30dd56e3f2c7128e1738f7ea4294ca325640d5ced28e0" exitCode=0 Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.858141 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" event={"ID":"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7","Type":"ContainerDied","Data":"f547792f9611b04d30f30dd56e3f2c7128e1738f7ea4294ca325640d5ced28e0"} Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.858254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" event={"ID":"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7","Type":"ContainerDied","Data":"ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db"} Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.858268 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab2bac67ed32f2188ec9cbf8f4872657d2900ed4e517d3c5df8f90cb530f23db" Feb 01 08:43:35 crc kubenswrapper[5127]: I0201 08:43:35.909503 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.078441 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config\") pod \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.078490 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb\") pod \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.078631 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq44g\" (UniqueName: \"kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g\") pod \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.078717 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb\") pod \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.078797 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc\") pod \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\" (UID: \"b41aa7ab-59fa-4b38-9927-a7c4d936cfa7\") " Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.085256 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g" (OuterVolumeSpecName: "kube-api-access-mq44g") pod "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" (UID: "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7"). InnerVolumeSpecName "kube-api-access-mq44g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.122172 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" (UID: "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.124280 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" (UID: "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.129132 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" (UID: "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.135732 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config" (OuterVolumeSpecName: "config") pod "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" (UID: "b41aa7ab-59fa-4b38-9927-a7c4d936cfa7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.181354 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq44g\" (UniqueName: \"kubernetes.io/projected/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-kube-api-access-mq44g\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.181413 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.181427 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.181440 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.181481 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.874868 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5746cb9b6f-hrzzc" Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.925514 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"] Feb 01 08:43:36 crc kubenswrapper[5127]: I0201 08:43:36.943284 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5746cb9b6f-hrzzc"] Feb 01 08:43:37 crc kubenswrapper[5127]: I0201 08:43:37.236770 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:43:37 crc kubenswrapper[5127]: E0201 08:43:37.237115 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:43:38 crc kubenswrapper[5127]: I0201 08:43:38.268701 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" path="/var/lib/kubelet/pods/b41aa7ab-59fa-4b38-9927-a7c4d936cfa7/volumes" Feb 01 08:43:38 crc kubenswrapper[5127]: I0201 08:43:38.939502 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 08:43:38 crc kubenswrapper[5127]: I0201 08:43:38.939905 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 08:43:38 crc kubenswrapper[5127]: I0201 08:43:38.994769 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 08:43:39 crc kubenswrapper[5127]: I0201 08:43:39.002888 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 08:43:39 crc kubenswrapper[5127]: I0201 08:43:39.908284 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 08:43:39 crc kubenswrapper[5127]: I0201 08:43:39.908348 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.213487 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.213988 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.259575 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.298267 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.933195 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:41 crc kubenswrapper[5127]: I0201 08:43:41.933251 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:42 crc kubenswrapper[5127]: I0201 
08:43:42.025085 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 08:43:42 crc kubenswrapper[5127]: I0201 08:43:42.025213 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 08:43:42 crc kubenswrapper[5127]: I0201 08:43:42.083970 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 08:43:42 crc kubenswrapper[5127]: I0201 08:43:42.878092 5127 scope.go:117] "RemoveContainer" containerID="69f88a5ce6591eafa4c49d26cc5b2b4edbe4ca8c6c30ecb8a759a81dc7795ffb" Feb 01 08:43:42 crc kubenswrapper[5127]: I0201 08:43:42.925728 5127 scope.go:117] "RemoveContainer" containerID="eae1d109b609b9ec9050a8b2e4c3c17c740c313155b20685463e47b4c9bbe08f" Feb 01 08:43:43 crc kubenswrapper[5127]: I0201 08:43:43.886835 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:43 crc kubenswrapper[5127]: I0201 08:43:43.927088 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.019536 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-lhrxg"] Feb 01 08:43:50 crc kubenswrapper[5127]: E0201 08:43:50.022930 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="init" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.022952 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="init" Feb 01 08:43:50 crc kubenswrapper[5127]: E0201 08:43:50.022964 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="dnsmasq-dns" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.022970 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="dnsmasq-dns" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.023138 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b41aa7ab-59fa-4b38-9927-a7c4d936cfa7" containerName="dnsmasq-dns" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.023857 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.029540 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lhrxg"] Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.129310 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a754-account-create-update-4tbqd"] Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.130982 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.134346 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.142252 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a754-account-create-update-4tbqd"] Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.166537 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.166615 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqb2\" (UniqueName: \"kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.241191 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:43:50 crc kubenswrapper[5127]: E0201 08:43:50.241460 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.269133 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ktz\" (UniqueName: \"kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.269434 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.269472 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btqb2\" (UniqueName: \"kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.269565 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " 
pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.270395 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.303164 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqb2\" (UniqueName: \"kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2\") pod \"placement-db-create-lhrxg\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.355997 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.371381 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ktz\" (UniqueName: \"kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.371959 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.372993 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.399767 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ktz\" (UniqueName: \"kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz\") pod \"placement-a754-account-create-update-4tbqd\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:50 crc kubenswrapper[5127]: I0201 08:43:50.458574 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:51 crc kubenswrapper[5127]: I0201 08:43:50.933459 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-lhrxg"] Feb 01 08:43:51 crc kubenswrapper[5127]: I0201 08:43:51.030260 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lhrxg" event={"ID":"289df401-c7a9-4098-95d7-94dd5affe406","Type":"ContainerStarted","Data":"7227d65d38e275f0554c25b540bdfd200ee22877a966ea22f45e7466b80e6e0f"} Feb 01 08:43:51 crc kubenswrapper[5127]: I0201 08:43:51.567048 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a754-account-create-update-4tbqd"] Feb 01 08:43:51 crc kubenswrapper[5127]: W0201 08:43:51.573670 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaecbfca9_cc14_4c56_8205_84f9e4f96b6e.slice/crio-9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41 WatchSource:0}: Error finding container 9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41: Status 404 returned error can't find the container with id 9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41 Feb 01 08:43:52 crc kubenswrapper[5127]: I0201 08:43:52.039939 5127 generic.go:334] "Generic (PLEG): container finished" podID="289df401-c7a9-4098-95d7-94dd5affe406" containerID="38f8d0d67d6df968d5c9947e6389b73c5bf01d304792008815da09ba7bd2d0fb" exitCode=0 Feb 01 08:43:52 crc kubenswrapper[5127]: I0201 08:43:52.040275 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lhrxg" event={"ID":"289df401-c7a9-4098-95d7-94dd5affe406","Type":"ContainerDied","Data":"38f8d0d67d6df968d5c9947e6389b73c5bf01d304792008815da09ba7bd2d0fb"} Feb 01 08:43:52 crc kubenswrapper[5127]: I0201 08:43:52.042360 5127 generic.go:334] "Generic (PLEG): container finished" podID="aecbfca9-cc14-4c56-8205-84f9e4f96b6e" containerID="7f147dda7528bb3dc8be11d16730d307b338c9a8f59c256dc7410d4e7b5ff399" exitCode=0 Feb 01 08:43:52 crc kubenswrapper[5127]: I0201 08:43:52.042391 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a754-account-create-update-4tbqd" event={"ID":"aecbfca9-cc14-4c56-8205-84f9e4f96b6e","Type":"ContainerDied","Data":"7f147dda7528bb3dc8be11d16730d307b338c9a8f59c256dc7410d4e7b5ff399"} Feb 01 08:43:52 crc kubenswrapper[5127]: I0201 08:43:52.042405 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a754-account-create-update-4tbqd" event={"ID":"aecbfca9-cc14-4c56-8205-84f9e4f96b6e","Type":"ContainerStarted","Data":"9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41"} Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.572823 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.583713 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.734680 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts\") pod \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.734722 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts\") pod \"289df401-c7a9-4098-95d7-94dd5affe406\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.734775 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btqb2\" (UniqueName: \"kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2\") pod \"289df401-c7a9-4098-95d7-94dd5affe406\" (UID: \"289df401-c7a9-4098-95d7-94dd5affe406\") " Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.734859 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8ktz\" (UniqueName: \"kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz\") pod \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\" (UID: \"aecbfca9-cc14-4c56-8205-84f9e4f96b6e\") " Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.736079 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aecbfca9-cc14-4c56-8205-84f9e4f96b6e" (UID: "aecbfca9-cc14-4c56-8205-84f9e4f96b6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.736153 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "289df401-c7a9-4098-95d7-94dd5affe406" (UID: "289df401-c7a9-4098-95d7-94dd5affe406"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.736980 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.737021 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/289df401-c7a9-4098-95d7-94dd5affe406-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.742217 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz" (OuterVolumeSpecName: "kube-api-access-d8ktz") pod "aecbfca9-cc14-4c56-8205-84f9e4f96b6e" (UID: "aecbfca9-cc14-4c56-8205-84f9e4f96b6e"). InnerVolumeSpecName "kube-api-access-d8ktz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.742357 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2" (OuterVolumeSpecName: "kube-api-access-btqb2") pod "289df401-c7a9-4098-95d7-94dd5affe406" (UID: "289df401-c7a9-4098-95d7-94dd5affe406"). InnerVolumeSpecName "kube-api-access-btqb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.839357 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btqb2\" (UniqueName: \"kubernetes.io/projected/289df401-c7a9-4098-95d7-94dd5affe406-kube-api-access-btqb2\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:53 crc kubenswrapper[5127]: I0201 08:43:53.839429 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8ktz\" (UniqueName: \"kubernetes.io/projected/aecbfca9-cc14-4c56-8205-84f9e4f96b6e-kube-api-access-d8ktz\") on node \"crc\" DevicePath \"\"" Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.065174 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-lhrxg" event={"ID":"289df401-c7a9-4098-95d7-94dd5affe406","Type":"ContainerDied","Data":"7227d65d38e275f0554c25b540bdfd200ee22877a966ea22f45e7466b80e6e0f"} Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.065542 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7227d65d38e275f0554c25b540bdfd200ee22877a966ea22f45e7466b80e6e0f" Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.065212 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-lhrxg" Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.067787 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a754-account-create-update-4tbqd" event={"ID":"aecbfca9-cc14-4c56-8205-84f9e4f96b6e","Type":"ContainerDied","Data":"9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41"} Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.067838 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aa93c3866379ff00f698a84e9c6dd7557205f948bd4bec0b83cecaed809fb41" Feb 01 08:43:54 crc kubenswrapper[5127]: I0201 08:43:54.067901 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a754-account-create-update-4tbqd" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.392449 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:43:55 crc kubenswrapper[5127]: E0201 08:43:55.393023 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecbfca9-cc14-4c56-8205-84f9e4f96b6e" containerName="mariadb-account-create-update" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.393044 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecbfca9-cc14-4c56-8205-84f9e4f96b6e" containerName="mariadb-account-create-update" Feb 01 08:43:55 crc kubenswrapper[5127]: E0201 08:43:55.393072 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="289df401-c7a9-4098-95d7-94dd5affe406" containerName="mariadb-database-create" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.393084 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="289df401-c7a9-4098-95d7-94dd5affe406" containerName="mariadb-database-create" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.393338 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecbfca9-cc14-4c56-8205-84f9e4f96b6e" containerName="mariadb-account-create-update" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.393360 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="289df401-c7a9-4098-95d7-94dd5affe406" containerName="mariadb-database-create" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.394627 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.416286 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.426246 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-5bzw7"] Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.427706 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.440932 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m5fkg" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.441245 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.441392 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.471705 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5bzw7"] Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.473877 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.473916 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwpz\" (UniqueName: \"kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.474267 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.474304 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.474406 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.575627 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7c2m\" (UniqueName: \"kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.575970 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 
08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.576852 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.576894 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.576922 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.576929 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmwpz\" (UniqueName: \"kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577125 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577167 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577264 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577324 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577575 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.577805 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.578340 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.599032 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmwpz\" (UniqueName: \"kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz\") pod \"dnsmasq-dns-66b55d5755-jzw5k\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679001 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679085 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679111 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7c2m\" (UniqueName: \"kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679173 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.679838 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs\") pod 
\"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.683931 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.686211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.695529 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7c2m\" (UniqueName: \"kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.699170 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle\") pod \"placement-db-sync-5bzw7\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.714710 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:55 crc kubenswrapper[5127]: I0201 08:43:55.769072 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5bzw7" Feb 01 08:43:56 crc kubenswrapper[5127]: I0201 08:43:56.219940 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:43:56 crc kubenswrapper[5127]: W0201 08:43:56.221416 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c37fca_5c5d_4565_9e8a_8427b2aca11f.slice/crio-038ceb25d1790ca9632f5bbabef022604334aa12331209bbf8406b33abddb7fa WatchSource:0}: Error finding container 038ceb25d1790ca9632f5bbabef022604334aa12331209bbf8406b33abddb7fa: Status 404 returned error can't find the container with id 038ceb25d1790ca9632f5bbabef022604334aa12331209bbf8406b33abddb7fa Feb 01 08:43:56 crc kubenswrapper[5127]: I0201 08:43:56.307616 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5bzw7"] Feb 01 08:43:56 crc kubenswrapper[5127]: I0201 08:43:56.311160 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:43:57 crc kubenswrapper[5127]: I0201 08:43:57.103641 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5bzw7" event={"ID":"a21d88c9-c7f4-49dc-ae5d-cdae33667b35","Type":"ContainerStarted","Data":"3e43de0491095e15be167dd298797e4a6f7b93457d6e25cd8d2116a1eae31fe7"} Feb 01 08:43:57 crc kubenswrapper[5127]: I0201 08:43:57.107797 5127 generic.go:334] "Generic (PLEG): container finished" podID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerID="f2a7860d84c78a5bf8cf0ed725fb03d91047d611f29c3afab34dc05c0d2728b8" exitCode=0 Feb 01 08:43:57 crc kubenswrapper[5127]: I0201 08:43:57.107835 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" event={"ID":"c2c37fca-5c5d-4565-9e8a-8427b2aca11f","Type":"ContainerDied","Data":"f2a7860d84c78a5bf8cf0ed725fb03d91047d611f29c3afab34dc05c0d2728b8"} Feb 01 08:43:57 crc kubenswrapper[5127]: I0201 08:43:57.107860 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" event={"ID":"c2c37fca-5c5d-4565-9e8a-8427b2aca11f","Type":"ContainerStarted","Data":"038ceb25d1790ca9632f5bbabef022604334aa12331209bbf8406b33abddb7fa"} Feb 01 08:43:58 crc kubenswrapper[5127]: I0201 08:43:58.140880 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" event={"ID":"c2c37fca-5c5d-4565-9e8a-8427b2aca11f","Type":"ContainerStarted","Data":"03e6a4c7e6336369aa3fa6cb3c84e5fddce779cfc6a3c791cbf95ddc761dcd00"} Feb 01 08:43:58 crc kubenswrapper[5127]: I0201 08:43:58.141283 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:43:58 crc kubenswrapper[5127]: I0201 08:43:58.165215 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" podStartSLOduration=3.165194833 podStartE2EDuration="3.165194833s" podCreationTimestamp="2026-02-01 08:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:43:58.157127816 +0000 UTC m=+6988.643030179" watchObservedRunningTime="2026-02-01 08:43:58.165194833 +0000 UTC m=+6988.651097196" Feb 01 08:44:00 crc kubenswrapper[5127]: I0201 08:44:00.164511 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5bzw7" 
event={"ID":"a21d88c9-c7f4-49dc-ae5d-cdae33667b35","Type":"ContainerStarted","Data":"396befc3a955dc91cf4ddc074ef30fd9ae7cfc913425f90feb960d086eec57ab"} Feb 01 08:44:00 crc kubenswrapper[5127]: I0201 08:44:00.198863 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5bzw7" podStartSLOduration=1.904412342 podStartE2EDuration="5.198841097s" podCreationTimestamp="2026-02-01 08:43:55 +0000 UTC" firstStartedPulling="2026-02-01 08:43:56.310805041 +0000 UTC m=+6986.796707414" lastFinishedPulling="2026-02-01 08:43:59.605233796 +0000 UTC m=+6990.091136169" observedRunningTime="2026-02-01 08:44:00.195300962 +0000 UTC m=+6990.681203385" watchObservedRunningTime="2026-02-01 08:44:00.198841097 +0000 UTC m=+6990.684743470" Feb 01 08:44:01 crc kubenswrapper[5127]: I0201 08:44:01.180149 5127 generic.go:334] "Generic (PLEG): container finished" podID="a21d88c9-c7f4-49dc-ae5d-cdae33667b35" containerID="396befc3a955dc91cf4ddc074ef30fd9ae7cfc913425f90feb960d086eec57ab" exitCode=0 Feb 01 08:44:01 crc kubenswrapper[5127]: I0201 08:44:01.180215 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5bzw7" event={"ID":"a21d88c9-c7f4-49dc-ae5d-cdae33667b35","Type":"ContainerDied","Data":"396befc3a955dc91cf4ddc074ef30fd9ae7cfc913425f90feb960d086eec57ab"} Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.614416 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5bzw7" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.632030 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle\") pod \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.632389 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data\") pod \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.633009 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7c2m\" (UniqueName: \"kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m\") pod \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.633118 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts\") pod \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.633217 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs\") pod \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\" (UID: \"a21d88c9-c7f4-49dc-ae5d-cdae33667b35\") " Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.633899 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs" (OuterVolumeSpecName: "logs") pod "a21d88c9-c7f4-49dc-ae5d-cdae33667b35" (UID: 
"a21d88c9-c7f4-49dc-ae5d-cdae33667b35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.644888 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts" (OuterVolumeSpecName: "scripts") pod "a21d88c9-c7f4-49dc-ae5d-cdae33667b35" (UID: "a21d88c9-c7f4-49dc-ae5d-cdae33667b35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.645662 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m" (OuterVolumeSpecName: "kube-api-access-f7c2m") pod "a21d88c9-c7f4-49dc-ae5d-cdae33667b35" (UID: "a21d88c9-c7f4-49dc-ae5d-cdae33667b35"). InnerVolumeSpecName "kube-api-access-f7c2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.681722 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a21d88c9-c7f4-49dc-ae5d-cdae33667b35" (UID: "a21d88c9-c7f4-49dc-ae5d-cdae33667b35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.697060 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data" (OuterVolumeSpecName: "config-data") pod "a21d88c9-c7f4-49dc-ae5d-cdae33667b35" (UID: "a21d88c9-c7f4-49dc-ae5d-cdae33667b35"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.735276 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.735320 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7c2m\" (UniqueName: \"kubernetes.io/projected/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-kube-api-access-f7c2m\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.735389 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.735401 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:02 crc kubenswrapper[5127]: I0201 08:44:02.735412 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d88c9-c7f4-49dc-ae5d-cdae33667b35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.202133 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5bzw7" event={"ID":"a21d88c9-c7f4-49dc-ae5d-cdae33667b35","Type":"ContainerDied","Data":"3e43de0491095e15be167dd298797e4a6f7b93457d6e25cd8d2116a1eae31fe7"} Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.202181 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e43de0491095e15be167dd298797e4a6f7b93457d6e25cd8d2116a1eae31fe7" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.202184 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5bzw7" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.242256 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:44:03 crc kubenswrapper[5127]: E0201 08:44:03.243037 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.301665 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-575cc7d444-lrng8"] Feb 01 08:44:03 crc kubenswrapper[5127]: E0201 08:44:03.302068 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21d88c9-c7f4-49dc-ae5d-cdae33667b35" containerName="placement-db-sync" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.302089 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d88c9-c7f4-49dc-ae5d-cdae33667b35" containerName="placement-db-sync" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.302311 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21d88c9-c7f4-49dc-ae5d-cdae33667b35" containerName="placement-db-sync" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.303435 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.305339 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-m5fkg" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.305511 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.307810 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.322327 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575cc7d444-lrng8"] Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.446870 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-config-data\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.447121 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ndgx\" (UniqueName: \"kubernetes.io/projected/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-kube-api-access-9ndgx\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.447601 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-logs\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 
01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.447679 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-scripts\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.447833 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-combined-ca-bundle\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.549616 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ndgx\" (UniqueName: \"kubernetes.io/projected/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-kube-api-access-9ndgx\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.549952 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-logs\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.549975 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-scripts\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.550009 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-combined-ca-bundle\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.550065 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-config-data\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.550338 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-logs\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.555991 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-config-data\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.559196 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-combined-ca-bundle\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.562327 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-scripts\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.576621 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ndgx\" (UniqueName: \"kubernetes.io/projected/4f22699e-8bf9-4bd5-8b56-6f2cab072f17-kube-api-access-9ndgx\") pod \"placement-575cc7d444-lrng8\" (UID: \"4f22699e-8bf9-4bd5-8b56-6f2cab072f17\") " pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:03 crc kubenswrapper[5127]: I0201 08:44:03.627010 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:04 crc kubenswrapper[5127]: I0201 08:44:04.127422 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-575cc7d444-lrng8"] Feb 01 08:44:04 crc kubenswrapper[5127]: I0201 08:44:04.213042 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575cc7d444-lrng8" event={"ID":"4f22699e-8bf9-4bd5-8b56-6f2cab072f17","Type":"ContainerStarted","Data":"683c08144487bae6316416e6d5118ad414401d07418515216bb8a6d3732caf84"} Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.229464 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575cc7d444-lrng8" event={"ID":"4f22699e-8bf9-4bd5-8b56-6f2cab072f17","Type":"ContainerStarted","Data":"89cd07857dc5ccc7fd3b0974d7f6cef7d84a9afe9fd844133db37f5cfb2d6afe"} Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.229862 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-575cc7d444-lrng8" event={"ID":"4f22699e-8bf9-4bd5-8b56-6f2cab072f17","Type":"ContainerStarted","Data":"0e6cbcd6585ef9d695201ff74c434ed39b13970287d59288a8a964909c488636"} Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.229895 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.267052 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-575cc7d444-lrng8" podStartSLOduration=2.2670260239999998 podStartE2EDuration="2.267026024s" podCreationTimestamp="2026-02-01 08:44:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:44:05.257614911 +0000 UTC m=+6995.743517324" watchObservedRunningTime="2026-02-01 08:44:05.267026024 +0000 UTC m=+6995.752928417" Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.716798 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.817878 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:44:05 crc kubenswrapper[5127]: I0201 08:44:05.818521 5127 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="dnsmasq-dns" containerID="cri-o://3d76621a2da79b532e43a5d3da4c6c029e842523aed07a14d7fe4f74068959b9" gracePeriod=10 Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.327572 5127 generic.go:334] "Generic (PLEG): container finished" podID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerID="3d76621a2da79b532e43a5d3da4c6c029e842523aed07a14d7fe4f74068959b9" exitCode=0 Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.342268 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.342312 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" event={"ID":"c35f18bb-68ff-4460-b5a6-848c47fc7495","Type":"ContainerDied","Data":"3d76621a2da79b532e43a5d3da4c6c029e842523aed07a14d7fe4f74068959b9"} Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.453639 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.611871 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb\") pod \"c35f18bb-68ff-4460-b5a6-848c47fc7495\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.611965 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb\") pod \"c35f18bb-68ff-4460-b5a6-848c47fc7495\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.612084 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config\") pod \"c35f18bb-68ff-4460-b5a6-848c47fc7495\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.612159 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgszm\" (UniqueName: \"kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm\") pod \"c35f18bb-68ff-4460-b5a6-848c47fc7495\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.612307 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc\") pod \"c35f18bb-68ff-4460-b5a6-848c47fc7495\" (UID: \"c35f18bb-68ff-4460-b5a6-848c47fc7495\") " Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.620942 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm" (OuterVolumeSpecName: "kube-api-access-cgszm") pod "c35f18bb-68ff-4460-b5a6-848c47fc7495" (UID: "c35f18bb-68ff-4460-b5a6-848c47fc7495"). InnerVolumeSpecName "kube-api-access-cgszm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.661625 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config" (OuterVolumeSpecName: "config") pod "c35f18bb-68ff-4460-b5a6-848c47fc7495" (UID: "c35f18bb-68ff-4460-b5a6-848c47fc7495"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.668289 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c35f18bb-68ff-4460-b5a6-848c47fc7495" (UID: "c35f18bb-68ff-4460-b5a6-848c47fc7495"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.680244 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c35f18bb-68ff-4460-b5a6-848c47fc7495" (UID: "c35f18bb-68ff-4460-b5a6-848c47fc7495"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.702741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c35f18bb-68ff-4460-b5a6-848c47fc7495" (UID: "c35f18bb-68ff-4460-b5a6-848c47fc7495"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.714187 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.714364 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.714490 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.714604 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgszm\" (UniqueName: \"kubernetes.io/projected/c35f18bb-68ff-4460-b5a6-848c47fc7495-kube-api-access-cgszm\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:06 crc kubenswrapper[5127]: I0201 08:44:06.714785 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c35f18bb-68ff-4460-b5a6-848c47fc7495-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.339183 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" event={"ID":"c35f18bb-68ff-4460-b5a6-848c47fc7495","Type":"ContainerDied","Data":"3ceda4b59d567870cd3f764f901dd241a5f35de012422db81150e194c1e453af"} Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.339253 5127 scope.go:117] "RemoveContainer" 
containerID="3d76621a2da79b532e43a5d3da4c6c029e842523aed07a14d7fe4f74068959b9" Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.339264 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68586cb4b7-l6qxh" Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.369696 5127 scope.go:117] "RemoveContainer" containerID="53fd0def7484ba67a38abe6568e41841f0f81ebfb4c6f49ec5fa40a2553d0053" Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.402004 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:44:07 crc kubenswrapper[5127]: I0201 08:44:07.408542 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68586cb4b7-l6qxh"] Feb 01 08:44:08 crc kubenswrapper[5127]: I0201 08:44:08.250434 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" path="/var/lib/kubelet/pods/c35f18bb-68ff-4460-b5a6-848c47fc7495/volumes" Feb 01 08:44:16 crc kubenswrapper[5127]: I0201 08:44:16.237410 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:44:16 crc kubenswrapper[5127]: E0201 08:44:16.238059 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:44:31 crc kubenswrapper[5127]: I0201 08:44:31.236338 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:44:31 crc kubenswrapper[5127]: E0201 08:44:31.237258 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:44:34 crc kubenswrapper[5127]: I0201 08:44:34.613023 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:34 crc kubenswrapper[5127]: I0201 08:44:34.617351 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-575cc7d444-lrng8" Feb 01 08:44:43 crc kubenswrapper[5127]: I0201 08:44:43.236665 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:44:43 crc kubenswrapper[5127]: I0201 08:44:43.995832 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411"} Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.957530 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-lgdpw"] Feb 01 08:44:58 crc kubenswrapper[5127]: E0201 08:44:58.958307 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="init" Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.958320 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="init" Feb 01 08:44:58 crc kubenswrapper[5127]: E0201 08:44:58.958339 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="dnsmasq-dns" Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.958345 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="dnsmasq-dns" Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.958497 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c35f18bb-68ff-4460-b5a6-848c47fc7495" containerName="dnsmasq-dns" Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.959035 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:58 crc kubenswrapper[5127]: I0201 08:44:58.967894 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lgdpw"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.034476 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7s9tm"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.039873 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.052641 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7s9tm"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.060184 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-dc19-account-create-update-4lrx5"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.061305 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.064117 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.071801 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dc19-account-create-update-4lrx5"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.073509 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjp5\" (UniqueName: \"kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.073577 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176375 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4klj\" (UniqueName: \"kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj\") pod \"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176547 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhn8r\" (UniqueName: \"kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176621 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176667 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts\") pod \"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176813 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjp5\" (UniqueName: \"kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.176889 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.177651 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.210376 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjp5\" (UniqueName: \"kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5\") pod \"nova-api-db-create-lgdpw\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.238130 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-2xclr"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.239200 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.245761 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6cad-account-create-update-kbxdv"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.246855 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.248498 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.262289 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6cad-account-create-update-kbxdv"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279506 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4klj\" (UniqueName: \"kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj\") pod \"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279646 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhn8r\" (UniqueName: \"kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279679 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279712 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts\") pod 
\"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279745 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7nt8\" (UniqueName: \"kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.279825 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.281410 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.281604 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts\") pod \"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.290256 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.295829 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2xclr"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.299811 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhn8r\" (UniqueName: \"kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r\") pod \"nova-cell0-db-create-7s9tm\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.302279 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4klj\" (UniqueName: \"kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj\") pod \"nova-api-dc19-account-create-update-4lrx5\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.360145 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.381100 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.381144 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrzg\" (UniqueName: \"kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.381171 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.381887 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.381999 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7nt8\" (UniqueName: \"kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.382789 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.401941 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7nt8\" (UniqueName: \"kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8\") pod \"nova-cell1-db-create-2xclr\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.458738 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2cf8-account-create-update-qhmcg"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.460138 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.463788 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.468982 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2cf8-account-create-update-qhmcg"] Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.484060 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrzg\" (UniqueName: \"kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.484115 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.484844 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.504228 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrzg\" (UniqueName: \"kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg\") pod \"nova-cell0-6cad-account-create-update-kbxdv\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.573826 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.585491 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.586448 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.586743 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r6ng\" (UniqueName: \"kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.688097 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.688248 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r6ng\" (UniqueName: \"kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.689000 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.708366 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r6ng\" (UniqueName: \"kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng\") pod \"nova-cell1-2cf8-account-create-update-qhmcg\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.794956 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.842575 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-lgdpw"] Feb 01 08:44:59 crc kubenswrapper[5127]: W0201 08:44:59.846811 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8550964_0e27_4d3e_a09c_3cad9587955f.slice/crio-e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b WatchSource:0}: Error finding container e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b: Status 404 returned error can't find the container with id e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b Feb 01 08:44:59 crc kubenswrapper[5127]: I0201 08:44:59.935966 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7s9tm"] Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.017865 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dc19-account-create-update-4lrx5"] Feb 01 08:45:00 crc kubenswrapper[5127]: W0201 08:45:00.060458 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2928aa86_c015_4379_b5d5_2254d8ca6989.slice/crio-7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369 WatchSource:0}: Error finding container 7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369: Status 404 returned error can't find the container with id 7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369 Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.099451 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-2xclr"] Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.115876 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2cf8-account-create-update-qhmcg"] Feb 01 08:45:00 crc kubenswrapper[5127]: W0201 08:45:00.117122 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6f0fdf1_acde_4ef1_af94_afed7e87232c.slice/crio-9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76 WatchSource:0}: Error finding container 9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76: Status 404 returned error can't find the container with id 9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76 Feb 01 08:45:00 crc kubenswrapper[5127]: W0201 08:45:00.122411 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda14185d1_b786_48b9_ad05_165029edace5.slice/crio-865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5 WatchSource:0}: Error finding container 865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5: Status 404 returned error can't find the container with id 865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5 Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.147279 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l"] Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.152463 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.155278 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.155474 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.157087 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l"] Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.184554 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lgdpw" event={"ID":"f8550964-0e27-4d3e-a09c-3cad9587955f","Type":"ContainerStarted","Data":"05c491ed359c2ffac9d5cc48b0d73a4a5647f2064cdacfebdaa40d3af42122aa"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.184634 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lgdpw" event={"ID":"f8550964-0e27-4d3e-a09c-3cad9587955f","Type":"ContainerStarted","Data":"e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.201627 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.201779 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.201865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b7qd\" (UniqueName: \"kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.211993 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2xclr" event={"ID":"e6f0fdf1-acde-4ef1-af94-afed7e87232c","Type":"ContainerStarted","Data":"9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.218918 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6cad-account-create-update-kbxdv"] Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.226337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" event={"ID":"a14185d1-b786-48b9-ad05-165029edace5","Type":"ContainerStarted","Data":"865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.232034 5127 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc19-account-create-update-4lrx5" event={"ID":"2928aa86-c015-4379-b5d5-2254d8ca6989","Type":"ContainerStarted","Data":"7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.233134 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7s9tm" event={"ID":"e05ff55a-0cef-4b41-aa86-84f55b33de4c","Type":"ContainerStarted","Data":"e7b75988749b05f8c5324faefccbe6122960b33db866cdca50805b8ca0257587"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.233308 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7s9tm" event={"ID":"e05ff55a-0cef-4b41-aa86-84f55b33de4c","Type":"ContainerStarted","Data":"625ab0227b4b323613fe10c6a7be97af013699cc3af6097d91a8988a1e5484ca"} Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.233927 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-lgdpw" podStartSLOduration=2.233902551 podStartE2EDuration="2.233902551s" podCreationTimestamp="2026-02-01 08:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:00.200115143 +0000 UTC m=+7050.686017516" watchObservedRunningTime="2026-02-01 08:45:00.233902551 +0000 UTC m=+7050.719804914" Feb 01 08:45:00 crc kubenswrapper[5127]: W0201 08:45:00.242524 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea8879f4_10bb_47f7_a83e_d35d8c6b9f8a.slice/crio-190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d WatchSource:0}: Error finding container 190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d: Status 404 returned error can't find the container with id 190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.297484 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-7s9tm" podStartSLOduration=1.297460698 podStartE2EDuration="1.297460698s" podCreationTimestamp="2026-02-01 08:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:00.262456248 +0000 UTC m=+7050.748358611" watchObservedRunningTime="2026-02-01 08:45:00.297460698 +0000 UTC m=+7050.783363061" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.305044 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.305221 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.305289 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b7qd\" 
(UniqueName: \"kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.309673 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.310216 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.325601 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b7qd\" (UniqueName: \"kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd\") pod \"collect-profiles-29498925-dqn9l\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.492933 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:00 crc kubenswrapper[5127]: I0201 08:45:00.954550 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l"] Feb 01 08:45:00 crc kubenswrapper[5127]: W0201 08:45:00.992966 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a02b0a_9955_45c6_af32_4b7aab3de4cf.slice/crio-50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26 WatchSource:0}: Error finding container 50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26: Status 404 returned error can't find the container with id 50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.247427 5127 generic.go:334] "Generic (PLEG): container finished" podID="a14185d1-b786-48b9-ad05-165029edace5" containerID="4626eaaf3f5bed3739a044ecda59c564da01048ffc6d767addd4cf9b14c7e640" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.247532 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" event={"ID":"a14185d1-b786-48b9-ad05-165029edace5","Type":"ContainerDied","Data":"4626eaaf3f5bed3739a044ecda59c564da01048ffc6d767addd4cf9b14c7e640"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.254674 5127 generic.go:334] "Generic (PLEG): container finished" podID="2928aa86-c015-4379-b5d5-2254d8ca6989" containerID="7aa19b3a25c83dda21ca109c6fb0130944d644f9913ed7a748ed4326fb3270b3" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.254772 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc19-account-create-update-4lrx5" 
event={"ID":"2928aa86-c015-4379-b5d5-2254d8ca6989","Type":"ContainerDied","Data":"7aa19b3a25c83dda21ca109c6fb0130944d644f9913ed7a748ed4326fb3270b3"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.256206 5127 generic.go:334] "Generic (PLEG): container finished" podID="ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" containerID="05c145723f144367b651cce19e3638fabdd06be8bd2ac3ec690c96748f0ac662" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.256274 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" event={"ID":"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a","Type":"ContainerDied","Data":"05c145723f144367b651cce19e3638fabdd06be8bd2ac3ec690c96748f0ac662"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.256292 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" event={"ID":"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a","Type":"ContainerStarted","Data":"190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.257078 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" event={"ID":"82a02b0a-9955-45c6-af32-4b7aab3de4cf","Type":"ContainerStarted","Data":"50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.258705 5127 generic.go:334] "Generic (PLEG): container finished" podID="e05ff55a-0cef-4b41-aa86-84f55b33de4c" containerID="e7b75988749b05f8c5324faefccbe6122960b33db866cdca50805b8ca0257587" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.258742 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7s9tm" event={"ID":"e05ff55a-0cef-4b41-aa86-84f55b33de4c","Type":"ContainerDied","Data":"e7b75988749b05f8c5324faefccbe6122960b33db866cdca50805b8ca0257587"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.260017 5127 generic.go:334] "Generic (PLEG): container finished" podID="f8550964-0e27-4d3e-a09c-3cad9587955f" containerID="05c491ed359c2ffac9d5cc48b0d73a4a5647f2064cdacfebdaa40d3af42122aa" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.260088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lgdpw" event={"ID":"f8550964-0e27-4d3e-a09c-3cad9587955f","Type":"ContainerDied","Data":"05c491ed359c2ffac9d5cc48b0d73a4a5647f2064cdacfebdaa40d3af42122aa"} Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.261692 5127 generic.go:334] "Generic (PLEG): container finished" podID="e6f0fdf1-acde-4ef1-af94-afed7e87232c" containerID="ffafe99877109ce63df8f9a8a6d8496ebe38981275b62aebd8e56368fb78b7ea" exitCode=0 Feb 01 08:45:01 crc kubenswrapper[5127]: I0201 08:45:01.261722 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2xclr" event={"ID":"e6f0fdf1-acde-4ef1-af94-afed7e87232c","Type":"ContainerDied","Data":"ffafe99877109ce63df8f9a8a6d8496ebe38981275b62aebd8e56368fb78b7ea"} Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.276638 5127 generic.go:334] "Generic (PLEG): container finished" podID="82a02b0a-9955-45c6-af32-4b7aab3de4cf" containerID="ac7eb1a45ea242f6bb4b653b4e2f00135d4457b2c2ccfccd6661267abf08e1ef" exitCode=0 Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.276768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" 
event={"ID":"82a02b0a-9955-45c6-af32-4b7aab3de4cf","Type":"ContainerDied","Data":"ac7eb1a45ea242f6bb4b653b4e2f00135d4457b2c2ccfccd6661267abf08e1ef"} Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.762163 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.884984 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts\") pod \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.885411 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7nt8\" (UniqueName: \"kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8\") pod \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\" (UID: \"e6f0fdf1-acde-4ef1-af94-afed7e87232c\") " Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.885837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6f0fdf1-acde-4ef1-af94-afed7e87232c" (UID: "e6f0fdf1-acde-4ef1-af94-afed7e87232c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.891311 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8" (OuterVolumeSpecName: "kube-api-access-d7nt8") pod "e6f0fdf1-acde-4ef1-af94-afed7e87232c" (UID: "e6f0fdf1-acde-4ef1-af94-afed7e87232c"). InnerVolumeSpecName "kube-api-access-d7nt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.960000 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.965695 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.973538 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.984047 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.990523 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.992366 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6f0fdf1-acde-4ef1-af94-afed7e87232c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:02 crc kubenswrapper[5127]: I0201 08:45:02.992428 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7nt8\" (UniqueName: \"kubernetes.io/projected/e6f0fdf1-acde-4ef1-af94-afed7e87232c-kube-api-access-d7nt8\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093089 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r6ng\" (UniqueName: \"kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng\") pod \"a14185d1-b786-48b9-ad05-165029edace5\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093171 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsrzg\" (UniqueName: \"kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg\") pod \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093220 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts\") pod \"f8550964-0e27-4d3e-a09c-3cad9587955f\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093249 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhn8r\" (UniqueName: \"kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r\") pod \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093798 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8550964-0e27-4d3e-a09c-3cad9587955f" (UID: "f8550964-0e27-4d3e-a09c-3cad9587955f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093880 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4klj\" (UniqueName: \"kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj\") pod \"2928aa86-c015-4379-b5d5-2254d8ca6989\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093922 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdjp5\" (UniqueName: \"kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5\") pod \"f8550964-0e27-4d3e-a09c-3cad9587955f\" (UID: \"f8550964-0e27-4d3e-a09c-3cad9587955f\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.093956 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts\") pod \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\" (UID: \"e05ff55a-0cef-4b41-aa86-84f55b33de4c\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094066 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts\") pod \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\" (UID: \"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094089 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts\") pod \"a14185d1-b786-48b9-ad05-165029edace5\" (UID: \"a14185d1-b786-48b9-ad05-165029edace5\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094142 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts\") pod \"2928aa86-c015-4379-b5d5-2254d8ca6989\" (UID: \"2928aa86-c015-4379-b5d5-2254d8ca6989\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094661 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a14185d1-b786-48b9-ad05-165029edace5" (UID: "a14185d1-b786-48b9-ad05-165029edace5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094684 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2928aa86-c015-4379-b5d5-2254d8ca6989" (UID: "2928aa86-c015-4379-b5d5-2254d8ca6989"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094727 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" (UID: "ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.094811 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e05ff55a-0cef-4b41-aa86-84f55b33de4c" (UID: "e05ff55a-0cef-4b41-aa86-84f55b33de4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.095098 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e05ff55a-0cef-4b41-aa86-84f55b33de4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.095125 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.095136 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14185d1-b786-48b9-ad05-165029edace5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.095145 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2928aa86-c015-4379-b5d5-2254d8ca6989-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.095154 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8550964-0e27-4d3e-a09c-3cad9587955f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.097285 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj" (OuterVolumeSpecName: "kube-api-access-d4klj") pod "2928aa86-c015-4379-b5d5-2254d8ca6989" (UID: "2928aa86-c015-4379-b5d5-2254d8ca6989"). InnerVolumeSpecName "kube-api-access-d4klj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.097385 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng" (OuterVolumeSpecName: "kube-api-access-6r6ng") pod "a14185d1-b786-48b9-ad05-165029edace5" (UID: "a14185d1-b786-48b9-ad05-165029edace5"). InnerVolumeSpecName "kube-api-access-6r6ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.097444 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5" (OuterVolumeSpecName: "kube-api-access-sdjp5") pod "f8550964-0e27-4d3e-a09c-3cad9587955f" (UID: "f8550964-0e27-4d3e-a09c-3cad9587955f"). InnerVolumeSpecName "kube-api-access-sdjp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.097462 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r" (OuterVolumeSpecName: "kube-api-access-bhn8r") pod "e05ff55a-0cef-4b41-aa86-84f55b33de4c" (UID: "e05ff55a-0cef-4b41-aa86-84f55b33de4c"). InnerVolumeSpecName "kube-api-access-bhn8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.097974 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg" (OuterVolumeSpecName: "kube-api-access-wsrzg") pod "ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" (UID: "ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a"). InnerVolumeSpecName "kube-api-access-wsrzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.196746 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdjp5\" (UniqueName: \"kubernetes.io/projected/f8550964-0e27-4d3e-a09c-3cad9587955f-kube-api-access-sdjp5\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.196786 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r6ng\" (UniqueName: \"kubernetes.io/projected/a14185d1-b786-48b9-ad05-165029edace5-kube-api-access-6r6ng\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.196794 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsrzg\" (UniqueName: \"kubernetes.io/projected/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a-kube-api-access-wsrzg\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.196805 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhn8r\" (UniqueName: \"kubernetes.io/projected/e05ff55a-0cef-4b41-aa86-84f55b33de4c-kube-api-access-bhn8r\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.196816 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4klj\" (UniqueName: \"kubernetes.io/projected/2928aa86-c015-4379-b5d5-2254d8ca6989-kube-api-access-d4klj\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.301331 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dc19-account-create-update-4lrx5" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.301338 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dc19-account-create-update-4lrx5" event={"ID":"2928aa86-c015-4379-b5d5-2254d8ca6989","Type":"ContainerDied","Data":"7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.301419 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d95f2f44c78f0072eb0c1a36a0942ba96e4d120b697ec4255b9333e83b01369" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.304109 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.304155 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6cad-account-create-update-kbxdv" event={"ID":"ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a","Type":"ContainerDied","Data":"190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.304330 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="190e10bbcd88a08aa0464357195fa87dde569d771e1ec15a7c36620ca9ac6d2d" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.305845 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-lgdpw" event={"ID":"f8550964-0e27-4d3e-a09c-3cad9587955f","Type":"ContainerDied","Data":"e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.305894 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e182a7d0b75e4f037e6c7348f2e96b6fa562b24388c544fe3485f13e4df3be2b" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.306429 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-lgdpw" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.307849 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7s9tm" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.307868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7s9tm" event={"ID":"e05ff55a-0cef-4b41-aa86-84f55b33de4c","Type":"ContainerDied","Data":"625ab0227b4b323613fe10c6a7be97af013699cc3af6097d91a8988a1e5484ca"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.307910 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625ab0227b4b323613fe10c6a7be97af013699cc3af6097d91a8988a1e5484ca" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.309993 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-2xclr" event={"ID":"e6f0fdf1-acde-4ef1-af94-afed7e87232c","Type":"ContainerDied","Data":"9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.310227 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9da62f19efb0c153bdcddf2ab684ac41db69fc4d35b062ac6e2ba91ef9b87b76" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.310027 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-2xclr" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.312730 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.314564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2cf8-account-create-update-qhmcg" event={"ID":"a14185d1-b786-48b9-ad05-165029edace5","Type":"ContainerDied","Data":"865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5"} Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.314629 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865946e274ceae30d6643a3affcb3f8a8eec3b288ef9a1631764458b2fd986b5" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.584692 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.707531 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume\") pod \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.707961 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume\") pod \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.708179 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b7qd\" (UniqueName: \"kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd\") pod \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\" (UID: \"82a02b0a-9955-45c6-af32-4b7aab3de4cf\") " Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.709401 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "82a02b0a-9955-45c6-af32-4b7aab3de4cf" (UID: "82a02b0a-9955-45c6-af32-4b7aab3de4cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.712725 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "82a02b0a-9955-45c6-af32-4b7aab3de4cf" (UID: "82a02b0a-9955-45c6-af32-4b7aab3de4cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.713417 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd" (OuterVolumeSpecName: "kube-api-access-6b7qd") pod "82a02b0a-9955-45c6-af32-4b7aab3de4cf" (UID: "82a02b0a-9955-45c6-af32-4b7aab3de4cf"). InnerVolumeSpecName "kube-api-access-6b7qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.812830 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/82a02b0a-9955-45c6-af32-4b7aab3de4cf-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.812867 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/82a02b0a-9955-45c6-af32-4b7aab3de4cf-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:03 crc kubenswrapper[5127]: I0201 08:45:03.812879 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b7qd\" (UniqueName: \"kubernetes.io/projected/82a02b0a-9955-45c6-af32-4b7aab3de4cf-kube-api-access-6b7qd\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.321903 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" event={"ID":"82a02b0a-9955-45c6-af32-4b7aab3de4cf","Type":"ContainerDied","Data":"50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26"} Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.321940 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50aa1b51ea276b48370529071dd6ba1ac852530d4476cf78094adabd8210ec26" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.322031 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.549451 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvj6t"] Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.549876 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14185d1-b786-48b9-ad05-165029edace5" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.549899 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14185d1-b786-48b9-ad05-165029edace5" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.549918 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8550964-0e27-4d3e-a09c-3cad9587955f" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.549927 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8550964-0e27-4d3e-a09c-3cad9587955f" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.549955 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a02b0a-9955-45c6-af32-4b7aab3de4cf" containerName="collect-profiles" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.549963 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a02b0a-9955-45c6-af32-4b7aab3de4cf" containerName="collect-profiles" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.549978 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6f0fdf1-acde-4ef1-af94-afed7e87232c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.549987 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6f0fdf1-acde-4ef1-af94-afed7e87232c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.550003 5127 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="2928aa86-c015-4379-b5d5-2254d8ca6989" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550011 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2928aa86-c015-4379-b5d5-2254d8ca6989" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.550025 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05ff55a-0cef-4b41-aa86-84f55b33de4c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550033 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05ff55a-0cef-4b41-aa86-84f55b33de4c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: E0201 08:45:04.550044 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550052 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550257 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8550964-0e27-4d3e-a09c-3cad9587955f" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550281 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14185d1-b786-48b9-ad05-165029edace5" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550295 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05ff55a-0cef-4b41-aa86-84f55b33de4c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550309 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2928aa86-c015-4379-b5d5-2254d8ca6989" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550327 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a02b0a-9955-45c6-af32-4b7aab3de4cf" containerName="collect-profiles" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550339 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" containerName="mariadb-account-create-update" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.550352 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6f0fdf1-acde-4ef1-af94-afed7e87232c" containerName="mariadb-database-create" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.551169 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.553783 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.554191 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-78zdj" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.554511 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.573392 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvj6t"] Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.629269 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.629360 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.629383 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.629400 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9mdg\" (UniqueName: \"kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.663997 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"] Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.678132 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498880-66frb"] Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.730973 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.731032 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: 
\"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.731060 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9mdg\" (UniqueName: \"kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.731178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.736404 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.738274 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.740289 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.760547 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9mdg\" (UniqueName: \"kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg\") pod \"nova-cell0-conductor-db-sync-rvj6t\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:04 crc kubenswrapper[5127]: I0201 08:45:04.869820 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:05 crc kubenswrapper[5127]: I0201 08:45:05.164831 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvj6t"] Feb 01 08:45:05 crc kubenswrapper[5127]: I0201 08:45:05.355250 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" event={"ID":"f1f3fda8-50b7-40be-9546-3d0c11bb1896","Type":"ContainerStarted","Data":"16b4c980ebf094e8cae74a6608972107708d2fdafd915fc864f0146b5df3cb91"} Feb 01 08:45:06 crc kubenswrapper[5127]: I0201 08:45:06.248019 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3e04d54-d98a-4b53-85b8-70986f9336c0" path="/var/lib/kubelet/pods/c3e04d54-d98a-4b53-85b8-70986f9336c0/volumes" Feb 01 08:45:15 crc kubenswrapper[5127]: I0201 08:45:15.479792 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" event={"ID":"f1f3fda8-50b7-40be-9546-3d0c11bb1896","Type":"ContainerStarted","Data":"5ce192ab93b075c2f785499a1ea0f266ecaced66bd27756e215e70189c6caa43"} Feb 01 08:45:15 crc kubenswrapper[5127]: I0201 08:45:15.519482 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" podStartSLOduration=2.11076079 podStartE2EDuration="11.519454745s" podCreationTimestamp="2026-02-01 08:45:04 +0000 UTC" firstStartedPulling="2026-02-01 08:45:05.167132238 +0000 UTC m=+7055.653034601" lastFinishedPulling="2026-02-01 08:45:14.575826203 +0000 UTC m=+7065.061728556" observedRunningTime="2026-02-01 08:45:15.505956252 +0000 UTC m=+7065.991858625" watchObservedRunningTime="2026-02-01 08:45:15.519454745 +0000 UTC m=+7066.005357138" Feb 01 08:45:20 crc kubenswrapper[5127]: I0201 08:45:20.536511 5127 generic.go:334] "Generic (PLEG): container finished" podID="f1f3fda8-50b7-40be-9546-3d0c11bb1896" containerID="5ce192ab93b075c2f785499a1ea0f266ecaced66bd27756e215e70189c6caa43" exitCode=0 Feb 01 08:45:20 crc kubenswrapper[5127]: I0201 08:45:20.536627 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" event={"ID":"f1f3fda8-50b7-40be-9546-3d0c11bb1896","Type":"ContainerDied","Data":"5ce192ab93b075c2f785499a1ea0f266ecaced66bd27756e215e70189c6caa43"} Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.019821 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.192391 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle\") pod \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.192690 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9mdg\" (UniqueName: \"kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg\") pod \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.192845 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts\") pod \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.192882 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data\") pod \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\" (UID: \"f1f3fda8-50b7-40be-9546-3d0c11bb1896\") " Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.199829 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts" (OuterVolumeSpecName: "scripts") pod "f1f3fda8-50b7-40be-9546-3d0c11bb1896" (UID: "f1f3fda8-50b7-40be-9546-3d0c11bb1896"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.201886 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg" (OuterVolumeSpecName: "kube-api-access-s9mdg") pod "f1f3fda8-50b7-40be-9546-3d0c11bb1896" (UID: "f1f3fda8-50b7-40be-9546-3d0c11bb1896"). InnerVolumeSpecName "kube-api-access-s9mdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.237283 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data" (OuterVolumeSpecName: "config-data") pod "f1f3fda8-50b7-40be-9546-3d0c11bb1896" (UID: "f1f3fda8-50b7-40be-9546-3d0c11bb1896"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.245783 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f3fda8-50b7-40be-9546-3d0c11bb1896" (UID: "f1f3fda8-50b7-40be-9546-3d0c11bb1896"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.296069 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9mdg\" (UniqueName: \"kubernetes.io/projected/f1f3fda8-50b7-40be-9546-3d0c11bb1896-kube-api-access-s9mdg\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.296127 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.296146 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.296163 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f3fda8-50b7-40be-9546-3d0c11bb1896-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.573525 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" event={"ID":"f1f3fda8-50b7-40be-9546-3d0c11bb1896","Type":"ContainerDied","Data":"16b4c980ebf094e8cae74a6608972107708d2fdafd915fc864f0146b5df3cb91"} Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.573637 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16b4c980ebf094e8cae74a6608972107708d2fdafd915fc864f0146b5df3cb91" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.573773 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvj6t" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.739814 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:45:22 crc kubenswrapper[5127]: E0201 08:45:22.740281 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f3fda8-50b7-40be-9546-3d0c11bb1896" containerName="nova-cell0-conductor-db-sync" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.740308 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f3fda8-50b7-40be-9546-3d0c11bb1896" containerName="nova-cell0-conductor-db-sync" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.740554 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f3fda8-50b7-40be-9546-3d0c11bb1896" containerName="nova-cell0-conductor-db-sync" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.741301 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.748781 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.751603 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-78zdj" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.757840 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.908297 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g4sx\" (UniqueName: \"kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.908386 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:22 crc kubenswrapper[5127]: I0201 08:45:22.908472 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.010276 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.010462 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g4sx\" (UniqueName: \"kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.010518 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.015036 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.015280 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.039905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g4sx\" (UniqueName: \"kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx\") pod \"nova-cell0-conductor-0\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.055315 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.550396 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:45:23 crc kubenswrapper[5127]: W0201 08:45:23.566419 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb90f8b5c_58e6_48e5_af62_2e0736a6895f.slice/crio-67e29dd4b9a5479a7cad76234db849aeda07116ce2a476ae160da81df449f391 WatchSource:0}: Error finding container 67e29dd4b9a5479a7cad76234db849aeda07116ce2a476ae160da81df449f391: Status 404 returned error can't find the container with id 67e29dd4b9a5479a7cad76234db849aeda07116ce2a476ae160da81df449f391 Feb 01 08:45:23 crc kubenswrapper[5127]: I0201 08:45:23.584411 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b90f8b5c-58e6-48e5-af62-2e0736a6895f","Type":"ContainerStarted","Data":"67e29dd4b9a5479a7cad76234db849aeda07116ce2a476ae160da81df449f391"} Feb 01 08:45:24 crc kubenswrapper[5127]: I0201 08:45:24.596825 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b90f8b5c-58e6-48e5-af62-2e0736a6895f","Type":"ContainerStarted","Data":"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049"} Feb 01 08:45:24 crc kubenswrapper[5127]: I0201 08:45:24.597496 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:24 crc kubenswrapper[5127]: I0201 08:45:24.631380 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.631352647 podStartE2EDuration="2.631352647s" podCreationTimestamp="2026-02-01 08:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:24.616494597 +0000 UTC m=+7075.102396960" watchObservedRunningTime="2026-02-01 08:45:24.631352647 +0000 UTC m=+7075.117255050" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.083544 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.562980 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lwm8n"] Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.564243 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.566287 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.577137 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.621860 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lwm8n"] Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.735138 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkgx\" (UniqueName: \"kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.735287 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.735377 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.735428 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.765182 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.771015 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.792347 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.800247 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837307 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837359 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837392 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837504 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837572 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkgx\" (UniqueName: \"kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837625 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837654 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.837724 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htn48\" (UniqueName: \"kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0" Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.839036 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.840641 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.844288 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.849905 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.850795 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.855135 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.867462 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.876530 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkgx\" (UniqueName: \"kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx\") pod \"nova-cell0-cell-mapping-lwm8n\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " pod="openstack/nova-cell0-cell-mapping-lwm8n"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.930007 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lwm8n"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947394 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947461 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947482 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s6n7\" (UniqueName: \"kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947504 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947550 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htn48\" (UniqueName: \"kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947591 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.947611 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.948027 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.963839 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.965938 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.969452 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.971428 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.985113 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 01 08:45:28 crc kubenswrapper[5127]: I0201 08:45:28.999641 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.025292 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htn48\" (UniqueName: \"kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48\") pod \"nova-api-0\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " pod="openstack/nova-api-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.025518 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.035786 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050033 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050118 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050205 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050233 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbhb\" (UniqueName: \"kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050300 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050392 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050647 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.050999 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.051356 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.051421 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh27m\" (UniqueName: \"kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.051760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s6n7\" (UniqueName: \"kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.060334 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.061794 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.068976 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.095197 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.098435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s6n7\" (UniqueName: \"kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7\") pod \"nova-cell1-novncproxy-0\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.110213 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"]
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.114471 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.129310 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.130496 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"]
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.153832 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.153865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.153897 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154035 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh27m\" (UniqueName: \"kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154108 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154132 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0"
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154168 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154199 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbhb\" (UniqueName: \"kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154231 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154266 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vh9\" (UniqueName: \"kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154287 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.154310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.162896 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.165540 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.166522 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.171741 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" 
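The mount records interleaved above follow the kubelet volume manager's fixed sequence for each new pod: reconciler_common.go first logs "operationExecutor.VerifyControllerAttachedVolume started" for every volume the pod spec declares (the desired state of world), then "operationExecutor.MountVolume started", and operation_generator.go logs "MountVolume.SetUp succeeded" once the mount completes (the actual state of world); the "No sandbox for pod can be found. Need to start a new one" records are the sync loop noticing that a pod whose volumes are still being prepared has no sandbox yet. Below is a minimal sketch of that desired-versus-actual reconcile loop; all names and data in it (desiredState, actualState, reconcile, the nova-api-0 volume list) are illustrative stand-ins, not the kubelet's real volume-manager API.

    // reconcile_sketch.go: an illustrative reduction of the reconcile
    // pattern visible in the records above; not actual kubelet code.
    package main

    import "fmt"

    // desiredState: the volumes each pod's spec requires, as populated
    // from the API server (desired state of world).
    var desiredState = map[string][]string{
        "openstack/nova-api-0": {"logs", "config-data", "combined-ca-bundle", "kube-api-access-htn48"},
    }

    // actualState: the volumes that have completed MountVolume.SetUp
    // (actual state of world).
    var actualState = map[string]map[string]bool{}

    // reconcile mounts whatever is desired but not yet actual, echoing the
    // VerifyControllerAttachedVolume -> MountVolume -> SetUp succeeded flow.
    func reconcile() {
        for pod, vols := range desiredState {
            if actualState[pod] == nil {
                actualState[pod] = map[string]bool{}
            }
            newlyMounted := 0
            for _, v := range vols {
                if actualState[pod][v] {
                    continue // already in actual state; nothing to do
                }
                fmt.Printf("MountVolume started for volume %q pod=%q\n", v, pod)
                actualState[pod][v] = true // the real operation is asynchronous and can fail
                fmt.Printf("MountVolume.SetUp succeeded for volume %q pod=%q\n", v, pod)
                newlyMounted++
            }
            if newlyMounted == 0 {
                fmt.Printf("pod %q volumes all mounted; sandbox creation can proceed\n", pod)
            }
        }
    }

    func main() {
        reconcile() // first pass: mounts all four declared volumes
        reconcile() // second pass: converged, so the pod can move on to its sandbox
    }

That ordering, volumes first and sandbox after, matches what the log itself shows: nova-api-0's last "MountVolume.SetUp succeeded" record (08:45:29.025292, kube-api-access-htn48) comes before its first ContainerStarted event (08:45:30.707068).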
Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.182492 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.190098 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbhb\" (UniqueName: \"kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb\") pod \"nova-scheduler-0\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.192276 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh27m\" (UniqueName: \"kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m\") pod \"nova-metadata-0\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.199653 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.255772 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.255809 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.255988 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.256029 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.256064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vh9\" (UniqueName: \"kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.257925 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " 
pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.258166 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.259262 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.259467 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.278403 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vh9\" (UniqueName: \"kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9\") pod \"dnsmasq-dns-55fc8b578c-v5nfr\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") " pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.490680 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.531366 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.610978 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lwm8n"] Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.664706 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9bll9"] Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.667849 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.671148 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.671349 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.678661 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lwm8n" event={"ID":"a0400c1f-002e-434f-b5b6-b7fec9bca69c","Type":"ContainerStarted","Data":"fa8a0947c91c0df857140df550bf9aea426770a3b3fb83770ab480f4b48a5016"} Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.679124 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9bll9"] Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.765184 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.765234 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68jbd\" (UniqueName: \"kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.765290 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.765512 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.866858 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.867178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.867324 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.867412 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68jbd\" (UniqueName: \"kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.870183 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.870627 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.870884 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.883245 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68jbd\" (UniqueName: \"kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd\") pod \"nova-cell1-conductor-db-sync-9bll9\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:29 crc kubenswrapper[5127]: I0201 08:45:29.987905 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.130076 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.139948 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:45:30 crc kubenswrapper[5127]: W0201 08:45:30.144149 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa2f0596_3e22_45b9_a55c_470187b3f661.slice/crio-17a907f9f5beee33e26fdc7449392103d0e7c2d2d7587cb46225b038ce4d9ded WatchSource:0}: Error finding container 17a907f9f5beee33e26fdc7449392103d0e7c2d2d7587cb46225b038ce4d9ded: Status 404 returned error can't find the container with id 17a907f9f5beee33e26fdc7449392103d0e7c2d2d7587cb46225b038ce4d9ded Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.150152 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.512069 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.688854 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9bll9"] Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.702923 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f309b19a-5193-4b77-93e0-e0fcc06e2dbc","Type":"ContainerStarted","Data":"ac229484892598d3c21cdc940bfa5b7daa08e62584b966da7ea372ce25aa6e26"} Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.703090 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"] Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.705399 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lwm8n" event={"ID":"a0400c1f-002e-434f-b5b6-b7fec9bca69c","Type":"ContainerStarted","Data":"62b6a8c25fc0540e522e04e13dfd46a0557f953c842b917df6abbcc13d3e0c6d"} Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.707068 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerStarted","Data":"5027b3f18b40e7c27c277a6c6e3c72c499d8f1f0492131945202d24cc7afed5f"} Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.709050 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de093257-3d61-4df8-86af-1000c1964ff3","Type":"ContainerStarted","Data":"2b76dd1c63e8192dd56248645321696e6f8eab7ff20edb8cb47c5fe1734e4be1"} Feb 01 08:45:30 crc kubenswrapper[5127]: W0201 08:45:30.710671 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod115ce664_8818_445d_8923_aa37f1ea49f6.slice/crio-eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da WatchSource:0}: Error finding container eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da: Status 404 returned error can't find the container with id eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.710759 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerStarted","Data":"17a907f9f5beee33e26fdc7449392103d0e7c2d2d7587cb46225b038ce4d9ded"} Feb 01 08:45:30 crc kubenswrapper[5127]: W0201 08:45:30.715946 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc71f4d2e_f115_44a7_bd74_aa0104e156ab.slice/crio-0216ec29c0c158afb49f9283d8d66c455ddfd81a7979e69648aaff4a0897a0eb WatchSource:0}: Error finding container 0216ec29c0c158afb49f9283d8d66c455ddfd81a7979e69648aaff4a0897a0eb: Status 404 returned error can't find the container with id 0216ec29c0c158afb49f9283d8d66c455ddfd81a7979e69648aaff4a0897a0eb Feb 01 08:45:30 crc kubenswrapper[5127]: I0201 08:45:30.724589 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lwm8n" podStartSLOduration=2.724559969 podStartE2EDuration="2.724559969s" podCreationTimestamp="2026-02-01 08:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:30.717039337 +0000 UTC m=+7081.202941700" watchObservedRunningTime="2026-02-01 08:45:30.724559969 +0000 UTC m=+7081.210462332" Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.723477 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9bll9" event={"ID":"115ce664-8818-445d-8923-aa37f1ea49f6","Type":"ContainerStarted","Data":"92e7ceda27d9704f7f6946ff56e623299cf9c96278bb0dc5e98bd560a309c0b8"} Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.723899 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9bll9" event={"ID":"115ce664-8818-445d-8923-aa37f1ea49f6","Type":"ContainerStarted","Data":"eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da"} Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.727772 5127 generic.go:334] "Generic (PLEG): container finished" podID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerID="b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31" exitCode=0 Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.729564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" event={"ID":"c71f4d2e-f115-44a7-bd74-aa0104e156ab","Type":"ContainerDied","Data":"b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31"} Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.729612 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" event={"ID":"c71f4d2e-f115-44a7-bd74-aa0104e156ab","Type":"ContainerStarted","Data":"0216ec29c0c158afb49f9283d8d66c455ddfd81a7979e69648aaff4a0897a0eb"} Feb 01 08:45:31 crc kubenswrapper[5127]: I0201 08:45:31.754345 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9bll9" podStartSLOduration=2.754323434 podStartE2EDuration="2.754323434s" podCreationTimestamp="2026-02-01 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:31.747845 +0000 UTC m=+7082.233747363" watchObservedRunningTime="2026-02-01 08:45:31.754323434 +0000 UTC m=+7082.240225797" Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.747173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"de093257-3d61-4df8-86af-1000c1964ff3","Type":"ContainerStarted","Data":"01e7a40c09a8d30251a233d1c1a65779b4fdcc33891f92d5910592c994dcc59d"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.751207 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" event={"ID":"c71f4d2e-f115-44a7-bd74-aa0104e156ab","Type":"ContainerStarted","Data":"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.751771 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.752969 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerStarted","Data":"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.753027 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerStarted","Data":"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.754728 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f309b19a-5193-4b77-93e0-e0fcc06e2dbc","Type":"ContainerStarted","Data":"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.756395 5127 generic.go:334] "Generic (PLEG): container finished" podID="115ce664-8818-445d-8923-aa37f1ea49f6" containerID="92e7ceda27d9704f7f6946ff56e623299cf9c96278bb0dc5e98bd560a309c0b8" exitCode=0
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.756460 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9bll9" event={"ID":"115ce664-8818-445d-8923-aa37f1ea49f6","Type":"ContainerDied","Data":"92e7ceda27d9704f7f6946ff56e623299cf9c96278bb0dc5e98bd560a309c0b8"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.758474 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerStarted","Data":"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.758519 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerStarted","Data":"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750"}
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.795367 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.09941773 podStartE2EDuration="5.795334249s" podCreationTimestamp="2026-02-01 08:45:28 +0000 UTC" firstStartedPulling="2026-02-01 08:45:30.19753918 +0000 UTC m=+7080.683441543" lastFinishedPulling="2026-02-01 08:45:32.893455699 +0000 UTC m=+7083.379358062" observedRunningTime="2026-02-01 08:45:33.780909461 +0000 UTC m=+7084.266811824" watchObservedRunningTime="2026-02-01 08:45:33.795334249 +0000 UTC m=+7084.281236612"
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.805372 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.109918491 podStartE2EDuration="5.805354027s" podCreationTimestamp="2026-02-01 08:45:28 +0000 UTC" firstStartedPulling="2026-02-01 08:45:30.198017483 +0000 UTC m=+7080.683919846" lastFinishedPulling="2026-02-01 08:45:32.893453009 +0000 UTC m=+7083.379355382" observedRunningTime="2026-02-01 08:45:33.801951076 +0000 UTC m=+7084.287853439" watchObservedRunningTime="2026-02-01 08:45:33.805354027 +0000 UTC m=+7084.291256390"
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.838608 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.110256401 podStartE2EDuration="5.8385664s" podCreationTimestamp="2026-02-01 08:45:28 +0000 UTC" firstStartedPulling="2026-02-01 08:45:30.167881173 +0000 UTC m=+7080.653783536" lastFinishedPulling="2026-02-01 08:45:32.896191172 +0000 UTC m=+7083.382093535" observedRunningTime="2026-02-01 08:45:33.823202437 +0000 UTC m=+7084.309104810" watchObservedRunningTime="2026-02-01 08:45:33.8385664 +0000 UTC m=+7084.324468763"
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.918763 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" podStartSLOduration=4.918744114 podStartE2EDuration="4.918744114s" podCreationTimestamp="2026-02-01 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:33.896201599 +0000 UTC m=+7084.382103962" watchObservedRunningTime="2026-02-01 08:45:33.918744114 +0000 UTC m=+7084.404646477"
Feb 01 08:45:33 crc kubenswrapper[5127]: I0201 08:45:33.920899 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.577890345 podStartE2EDuration="5.920890892s" podCreationTimestamp="2026-02-01 08:45:28 +0000 UTC" firstStartedPulling="2026-02-01 08:45:30.551483399 +0000 UTC m=+7081.037385762" lastFinishedPulling="2026-02-01 08:45:32.894483936 +0000 UTC m=+7083.380386309" observedRunningTime="2026-02-01 08:45:33.91038835 +0000 UTC m=+7084.396290713" watchObservedRunningTime="2026-02-01 08:45:33.920890892 +0000 UTC m=+7084.406793255"
Feb 01 08:45:34 crc kubenswrapper[5127]: I0201 08:45:34.129966 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 01 08:45:34 crc kubenswrapper[5127]: I0201 08:45:34.201660 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 08:45:34 crc kubenswrapper[5127]: I0201 08:45:34.201708 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 08:45:34 crc kubenswrapper[5127]: I0201 08:45:34.492013 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.122285 5127 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.318817 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68jbd\" (UniqueName: \"kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd\") pod \"115ce664-8818-445d-8923-aa37f1ea49f6\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.318945 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle\") pod \"115ce664-8818-445d-8923-aa37f1ea49f6\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.319112 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts\") pod \"115ce664-8818-445d-8923-aa37f1ea49f6\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.319194 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data\") pod \"115ce664-8818-445d-8923-aa37f1ea49f6\" (UID: \"115ce664-8818-445d-8923-aa37f1ea49f6\") " Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.326935 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts" (OuterVolumeSpecName: "scripts") pod "115ce664-8818-445d-8923-aa37f1ea49f6" (UID: "115ce664-8818-445d-8923-aa37f1ea49f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.327339 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd" (OuterVolumeSpecName: "kube-api-access-68jbd") pod "115ce664-8818-445d-8923-aa37f1ea49f6" (UID: "115ce664-8818-445d-8923-aa37f1ea49f6"). InnerVolumeSpecName "kube-api-access-68jbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.368163 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data" (OuterVolumeSpecName: "config-data") pod "115ce664-8818-445d-8923-aa37f1ea49f6" (UID: "115ce664-8818-445d-8923-aa37f1ea49f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.368779 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "115ce664-8818-445d-8923-aa37f1ea49f6" (UID: "115ce664-8818-445d-8923-aa37f1ea49f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.421744 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.422482 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.422600 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68jbd\" (UniqueName: \"kubernetes.io/projected/115ce664-8818-445d-8923-aa37f1ea49f6-kube-api-access-68jbd\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.422683 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/115ce664-8818-445d-8923-aa37f1ea49f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.780611 5127 generic.go:334] "Generic (PLEG): container finished" podID="a0400c1f-002e-434f-b5b6-b7fec9bca69c" containerID="62b6a8c25fc0540e522e04e13dfd46a0557f953c842b917df6abbcc13d3e0c6d" exitCode=0 Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.780687 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lwm8n" event={"ID":"a0400c1f-002e-434f-b5b6-b7fec9bca69c","Type":"ContainerDied","Data":"62b6a8c25fc0540e522e04e13dfd46a0557f953c842b917df6abbcc13d3e0c6d"} Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.785176 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9bll9" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.785167 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9bll9" event={"ID":"115ce664-8818-445d-8923-aa37f1ea49f6","Type":"ContainerDied","Data":"eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da"} Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.785253 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eadf29d51dc1aa2390204c2cbc135d3c231441ee04bdedb3c0830b2c12eb90da" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.930527 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:45:35 crc kubenswrapper[5127]: E0201 08:45:35.932242 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115ce664-8818-445d-8923-aa37f1ea49f6" containerName="nova-cell1-conductor-db-sync" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.932288 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="115ce664-8818-445d-8923-aa37f1ea49f6" containerName="nova-cell1-conductor-db-sync" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.932803 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="115ce664-8818-445d-8923-aa37f1ea49f6" containerName="nova-cell1-conductor-db-sync" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.934026 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.936073 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 08:45:35 crc kubenswrapper[5127]: I0201 08:45:35.949430 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.134833 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.134954 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.135121 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7mn\" (UniqueName: \"kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.237272 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7mn\" (UniqueName: \"kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.237451 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.237628 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.244102 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.244184 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.261148 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7mn\" (UniqueName: \"kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn\") pod \"nova-cell1-conductor-0\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:36 crc kubenswrapper[5127]: I0201 08:45:36.551084 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.114513 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:45:37 crc kubenswrapper[5127]: W0201 08:45:37.118533 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89951ae8_f890_4ce2_9146_fed7435253c5.slice/crio-ce8be4eb6e4cf5d5aaa559ad891d00fd28c6b9b2087d58de77dc2b36ced0bf17 WatchSource:0}: Error finding container ce8be4eb6e4cf5d5aaa559ad891d00fd28c6b9b2087d58de77dc2b36ced0bf17: Status 404 returned error can't find the container with id ce8be4eb6e4cf5d5aaa559ad891d00fd28c6b9b2087d58de77dc2b36ced0bf17 Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.179111 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.358425 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data\") pod \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.358772 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkgx\" (UniqueName: \"kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx\") pod \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.358976 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts\") pod \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.359017 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle\") pod \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\" (UID: \"a0400c1f-002e-434f-b5b6-b7fec9bca69c\") " Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.361990 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts" (OuterVolumeSpecName: "scripts") pod "a0400c1f-002e-434f-b5b6-b7fec9bca69c" (UID: "a0400c1f-002e-434f-b5b6-b7fec9bca69c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.364467 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx" (OuterVolumeSpecName: "kube-api-access-6dkgx") pod "a0400c1f-002e-434f-b5b6-b7fec9bca69c" (UID: "a0400c1f-002e-434f-b5b6-b7fec9bca69c"). InnerVolumeSpecName "kube-api-access-6dkgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.383660 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data" (OuterVolumeSpecName: "config-data") pod "a0400c1f-002e-434f-b5b6-b7fec9bca69c" (UID: "a0400c1f-002e-434f-b5b6-b7fec9bca69c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.384832 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0400c1f-002e-434f-b5b6-b7fec9bca69c" (UID: "a0400c1f-002e-434f-b5b6-b7fec9bca69c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.461388 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.461419 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkgx\" (UniqueName: \"kubernetes.io/projected/a0400c1f-002e-434f-b5b6-b7fec9bca69c-kube-api-access-6dkgx\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.461429 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.461438 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0400c1f-002e-434f-b5b6-b7fec9bca69c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.839317 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lwm8n" event={"ID":"a0400c1f-002e-434f-b5b6-b7fec9bca69c","Type":"ContainerDied","Data":"fa8a0947c91c0df857140df550bf9aea426770a3b3fb83770ab480f4b48a5016"} Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.839374 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa8a0947c91c0df857140df550bf9aea426770a3b3fb83770ab480f4b48a5016" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.839445 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lwm8n" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.843384 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89951ae8-f890-4ce2-9146-fed7435253c5","Type":"ContainerStarted","Data":"6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6"} Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.843413 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89951ae8-f890-4ce2-9146-fed7435253c5","Type":"ContainerStarted","Data":"ce8be4eb6e4cf5d5aaa559ad891d00fd28c6b9b2087d58de77dc2b36ced0bf17"} Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.843682 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:37 crc kubenswrapper[5127]: I0201 08:45:37.869756 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.869732312 podStartE2EDuration="2.869732312s" podCreationTimestamp="2026-02-01 08:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:37.868259682 +0000 UTC m=+7088.354162085" watchObservedRunningTime="2026-02-01 08:45:37.869732312 +0000 UTC m=+7088.355634715" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.013468 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.013862 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-log" containerID="cri-o://61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" gracePeriod=30 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.014198 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-api" containerID="cri-o://8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" gracePeriod=30 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.044866 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.045186 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" containerName="nova-scheduler-scheduler" containerID="cri-o://c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50" gracePeriod=30 Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.045817 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0400c1f_002e_434f_b5b6_b7fec9bca69c.slice/crio-fa8a0947c91c0df857140df550bf9aea426770a3b3fb83770ab480f4b48a5016\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0400c1f_002e_434f_b5b6_b7fec9bca69c.slice\": RecentStats: unable to find data in memory cache]" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.054861 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.055086 
5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-log" containerID="cri-o://e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" gracePeriod=30 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.055249 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-metadata" containerID="cri-o://b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" gracePeriod=30 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.649870 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.725037 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.802748 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs\") pod \"eb02bd61-646f-477b-bcee-e2153e570519\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.802802 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data\") pod \"eb02bd61-646f-477b-bcee-e2153e570519\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.802827 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htn48\" (UniqueName: \"kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48\") pod \"eb02bd61-646f-477b-bcee-e2153e570519\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.803025 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle\") pod \"eb02bd61-646f-477b-bcee-e2153e570519\" (UID: \"eb02bd61-646f-477b-bcee-e2153e570519\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.803501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs" (OuterVolumeSpecName: "logs") pod "eb02bd61-646f-477b-bcee-e2153e570519" (UID: "eb02bd61-646f-477b-bcee-e2153e570519"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.808884 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48" (OuterVolumeSpecName: "kube-api-access-htn48") pod "eb02bd61-646f-477b-bcee-e2153e570519" (UID: "eb02bd61-646f-477b-bcee-e2153e570519"). InnerVolumeSpecName "kube-api-access-htn48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.825433 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb02bd61-646f-477b-bcee-e2153e570519" (UID: "eb02bd61-646f-477b-bcee-e2153e570519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.826087 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data" (OuterVolumeSpecName: "config-data") pod "eb02bd61-646f-477b-bcee-e2153e570519" (UID: "eb02bd61-646f-477b-bcee-e2153e570519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.857973 5127 generic.go:334] "Generic (PLEG): container finished" podID="eb02bd61-646f-477b-bcee-e2153e570519" containerID="8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" exitCode=0 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858003 5127 generic.go:334] "Generic (PLEG): container finished" podID="eb02bd61-646f-477b-bcee-e2153e570519" containerID="61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" exitCode=143 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858036 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerDied","Data":"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858062 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerDied","Data":"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858071 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"eb02bd61-646f-477b-bcee-e2153e570519","Type":"ContainerDied","Data":"5027b3f18b40e7c27c277a6c6e3c72c499d8f1f0492131945202d24cc7afed5f"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858085 5127 scope.go:117] "RemoveContainer" containerID="8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.858190 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867411 5127 generic.go:334] "Generic (PLEG): container finished" podID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerID="b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" exitCode=0 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867449 5127 generic.go:334] "Generic (PLEG): container finished" podID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerID="e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" exitCode=143 Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerDied","Data":"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867542 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerDied","Data":"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fa2f0596-3e22-45b9-a55c-470187b3f661","Type":"ContainerDied","Data":"17a907f9f5beee33e26fdc7449392103d0e7c2d2d7587cb46225b038ce4d9ded"} Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.867497 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.895033 5127 scope.go:117] "RemoveContainer" containerID="61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.906250 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh27m\" (UniqueName: \"kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m\") pod \"fa2f0596-3e22-45b9-a55c-470187b3f661\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.906745 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data\") pod \"fa2f0596-3e22-45b9-a55c-470187b3f661\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.906804 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs\") pod \"fa2f0596-3e22-45b9-a55c-470187b3f661\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.906908 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle\") pod \"fa2f0596-3e22-45b9-a55c-470187b3f661\" (UID: \"fa2f0596-3e22-45b9-a55c-470187b3f661\") " Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.907387 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 
08:45:38.907404 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb02bd61-646f-477b-bcee-e2153e570519-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.907414 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb02bd61-646f-477b-bcee-e2153e570519-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.907423 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htn48\" (UniqueName: \"kubernetes.io/projected/eb02bd61-646f-477b-bcee-e2153e570519-kube-api-access-htn48\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.907456 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs" (OuterVolumeSpecName: "logs") pod "fa2f0596-3e22-45b9-a55c-470187b3f661" (UID: "fa2f0596-3e22-45b9-a55c-470187b3f661"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.914216 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.920733 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m" (OuterVolumeSpecName: "kube-api-access-dh27m") pod "fa2f0596-3e22-45b9-a55c-470187b3f661" (UID: "fa2f0596-3e22-45b9-a55c-470187b3f661"). InnerVolumeSpecName "kube-api-access-dh27m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.927357 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.946714 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.947136 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0400c1f-002e-434f-b5b6-b7fec9bca69c" containerName="nova-manage" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947149 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0400c1f-002e-434f-b5b6-b7fec9bca69c" containerName="nova-manage" Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.947165 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-api" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947171 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-api" Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.947183 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-log" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947190 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-log" Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.947213 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-metadata" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947219 5127 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-metadata" Feb 01 08:45:38 crc kubenswrapper[5127]: E0201 08:45:38.947234 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-log" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947243 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-log" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947417 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-metadata" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947428 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-api" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947439 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb02bd61-646f-477b-bcee-e2153e570519" containerName="nova-api-log" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947446 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0400c1f-002e-434f-b5b6-b7fec9bca69c" containerName="nova-manage" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.947458 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" containerName="nova-metadata-log" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.948426 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.950900 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.958918 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.971948 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data" (OuterVolumeSpecName: "config-data") pod "fa2f0596-3e22-45b9-a55c-470187b3f661" (UID: "fa2f0596-3e22-45b9-a55c-470187b3f661"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:38 crc kubenswrapper[5127]: I0201 08:45:38.980028 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa2f0596-3e22-45b9-a55c-470187b3f661" (UID: "fa2f0596-3e22-45b9-a55c-470187b3f661"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.012939 5127 scope.go:117] "RemoveContainer" containerID="8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.013474 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd\": container with ID starting with 8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd not found: ID does not exist" containerID="8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.013504 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd"} err="failed to get container status \"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd\": rpc error: code = NotFound desc = could not find container \"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd\": container with ID starting with 8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.013527 5127 scope.go:117] "RemoveContainer" containerID="61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.014226 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750\": container with ID starting with 61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750 not found: ID does not exist" containerID="61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014292 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750"} err="failed to get container status \"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750\": rpc error: code = NotFound desc = could not find container \"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750\": container with ID starting with 61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014325 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh27m\" (UniqueName: \"kubernetes.io/projected/fa2f0596-3e22-45b9-a55c-470187b3f661-kube-api-access-dh27m\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014348 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014361 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa2f0596-3e22-45b9-a55c-470187b3f661-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014375 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa2f0596-3e22-45b9-a55c-470187b3f661-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014330 5127 scope.go:117] "RemoveContainer" containerID="8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.014981 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd"} err="failed to get container status \"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd\": rpc error: code = NotFound desc = could not find container \"8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd\": container with ID starting with 8935f7e4c3ea3ca59043009ed3296ff276416bb59eb25bd8b89b32d3638db2dd not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.015010 5127 scope.go:117] "RemoveContainer" containerID="61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.015515 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750"} err="failed to get container status \"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750\": rpc error: code = NotFound desc = could not find container \"61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750\": container with ID starting with 61669f6e688713f15b3ec734aa316b7ed4cb3ac2a45740989e779e345015c750 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.015573 5127 scope.go:117] "RemoveContainer" containerID="b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.039614 5127 scope.go:117] "RemoveContainer" containerID="e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.064189 5127 scope.go:117] "RemoveContainer" containerID="b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.064570 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2\": container with ID starting with b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2 not found: ID does not exist" containerID="b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.064619 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2"} err="failed to get container status \"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2\": rpc error: code = NotFound desc = could not find container \"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2\": container with ID starting with b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.064640 5127 scope.go:117] "RemoveContainer" containerID="e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.064817 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4\": container with ID starting with e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4 not found: ID does not exist" containerID="e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.064832 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4"} err="failed to get container status \"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4\": rpc error: code = NotFound desc = could not find container \"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4\": container with ID starting with e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.064843 5127 scope.go:117] "RemoveContainer" containerID="b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.065019 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2"} err="failed to get container status \"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2\": rpc error: code = NotFound desc = could not find container \"b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2\": container with ID starting with b5f75ef4b08466e1114fc8b02a5cc131f101a9d542723cb657561004c7c04cc2 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.065032 5127 scope.go:117] "RemoveContainer" containerID="e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.065442 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4"} err="failed to get container status \"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4\": rpc error: code = NotFound desc = could not find container \"e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4\": container with ID starting with e5934672893793ddf78c14b3ec0486be825ad110e6aea5052800e3621eb688d4 not found: ID does not exist" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.116316 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlkhn\" (UniqueName: \"kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.117312 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.117730 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.117986 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.130726 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.169022 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.219923 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlkhn\" (UniqueName: \"kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.219962 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.220030 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.220106 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.222020 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.232703 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.239215 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.242731 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.249070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlkhn\" (UniqueName: 
\"kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn\") pod \"nova-api-0\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.254832 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.264455 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.266187 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.268913 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.273693 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.305250 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.322080 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.322150 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.322222 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.322255 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfb9g\" (UniqueName: \"kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.415615 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.423248 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle\") pod \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.423397 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.423447 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.423496 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.423538 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfb9g\" (UniqueName: \"kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.424082 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.432115 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.432932 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.444065 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfb9g\" (UniqueName: \"kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g\") pod \"nova-metadata-0\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.475434 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"f309b19a-5193-4b77-93e0-e0fcc06e2dbc" (UID: "f309b19a-5193-4b77-93e0-e0fcc06e2dbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.527594 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbhb\" (UniqueName: \"kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb\") pod \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.527657 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data\") pod \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\" (UID: \"f309b19a-5193-4b77-93e0-e0fcc06e2dbc\") " Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.528324 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.530870 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb" (OuterVolumeSpecName: "kube-api-access-gqbhb") pod "f309b19a-5193-4b77-93e0-e0fcc06e2dbc" (UID: "f309b19a-5193-4b77-93e0-e0fcc06e2dbc"). InnerVolumeSpecName "kube-api-access-gqbhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.534748 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.555191 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data" (OuterVolumeSpecName: "config-data") pod "f309b19a-5193-4b77-93e0-e0fcc06e2dbc" (UID: "f309b19a-5193-4b77-93e0-e0fcc06e2dbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.609682 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.609927 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="dnsmasq-dns" containerID="cri-o://03e6a4c7e6336369aa3fa6cb3c84e5fddce779cfc6a3c791cbf95ddc761dcd00" gracePeriod=10 Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.632067 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqbhb\" (UniqueName: \"kubernetes.io/projected/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-kube-api-access-gqbhb\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.632101 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f309b19a-5193-4b77-93e0-e0fcc06e2dbc-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.730960 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.749896 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.879377 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerStarted","Data":"644e1b49b3154e5645c376677a8e42eea138b1a7b483c66781efdad7a09a054e"} Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.885160 5127 generic.go:334] "Generic (PLEG): container finished" podID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" containerID="c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50" exitCode=0 Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.885293 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.885722 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f309b19a-5193-4b77-93e0-e0fcc06e2dbc","Type":"ContainerDied","Data":"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50"} Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.885783 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f309b19a-5193-4b77-93e0-e0fcc06e2dbc","Type":"ContainerDied","Data":"ac229484892598d3c21cdc940bfa5b7daa08e62584b966da7ea372ce25aa6e26"} Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.885804 5127 scope.go:117] "RemoveContainer" containerID="c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.924832 5127 generic.go:334] "Generic (PLEG): container finished" podID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerID="03e6a4c7e6336369aa3fa6cb3c84e5fddce779cfc6a3c791cbf95ddc761dcd00" exitCode=0 Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.925308 5127 scope.go:117] "RemoveContainer" containerID="c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.925498 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" event={"ID":"c2c37fca-5c5d-4565-9e8a-8427b2aca11f","Type":"ContainerDied","Data":"03e6a4c7e6336369aa3fa6cb3c84e5fddce779cfc6a3c791cbf95ddc761dcd00"} Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.925995 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50\": container with ID starting with c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50 not found: ID does not exist" containerID="c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.926026 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50"} err="failed to get container status \"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50\": rpc error: code = NotFound desc = could not find container \"c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50\": container with ID starting with c26b2a9619412e24cfd30ed1b6e0c296f3bd956694e8ff71f14c238bd279da50 not found: ID does not exist" Feb 01 08:45:39 crc 
kubenswrapper[5127]: I0201 08:45:39.933185 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.940031 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.962053 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.973639 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: E0201 08:45:39.974224 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" containerName="nova-scheduler-scheduler" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.974251 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" containerName="nova-scheduler-scheduler" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.975038 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" containerName="nova-scheduler-scheduler" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.977088 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.983474 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:39 crc kubenswrapper[5127]: I0201 08:45:39.989503 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.042546 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jrl\" (UniqueName: \"kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.043030 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.043113 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.096397 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145387 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb\") pod \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145496 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc\") pod \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145602 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmwpz\" (UniqueName: \"kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz\") pod \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145631 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb\") pod \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145736 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config\") pod \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\" (UID: \"c2c37fca-5c5d-4565-9e8a-8427b2aca11f\") " Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145918 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.145975 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.146038 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jrl\" (UniqueName: \"kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.151729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.154640 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz" (OuterVolumeSpecName: 
"kube-api-access-lmwpz") pod "c2c37fca-5c5d-4565-9e8a-8427b2aca11f" (UID: "c2c37fca-5c5d-4565-9e8a-8427b2aca11f"). InnerVolumeSpecName "kube-api-access-lmwpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.159023 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.162104 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jrl\" (UniqueName: \"kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl\") pod \"nova-scheduler-0\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.201714 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c2c37fca-5c5d-4565-9e8a-8427b2aca11f" (UID: "c2c37fca-5c5d-4565-9e8a-8427b2aca11f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.205737 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c2c37fca-5c5d-4565-9e8a-8427b2aca11f" (UID: "c2c37fca-5c5d-4565-9e8a-8427b2aca11f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.212503 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config" (OuterVolumeSpecName: "config") pod "c2c37fca-5c5d-4565-9e8a-8427b2aca11f" (UID: "c2c37fca-5c5d-4565-9e8a-8427b2aca11f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.218545 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c2c37fca-5c5d-4565-9e8a-8427b2aca11f" (UID: "c2c37fca-5c5d-4565-9e8a-8427b2aca11f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.247328 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.247372 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.247387 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmwpz\" (UniqueName: \"kubernetes.io/projected/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-kube-api-access-lmwpz\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.247397 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.247408 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2c37fca-5c5d-4565-9e8a-8427b2aca11f-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.256105 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb02bd61-646f-477b-bcee-e2153e570519" path="/var/lib/kubelet/pods/eb02bd61-646f-477b-bcee-e2153e570519/volumes" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.256883 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f309b19a-5193-4b77-93e0-e0fcc06e2dbc" path="/var/lib/kubelet/pods/f309b19a-5193-4b77-93e0-e0fcc06e2dbc/volumes" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.257441 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2f0596-3e22-45b9-a55c-470187b3f661" path="/var/lib/kubelet/pods/fa2f0596-3e22-45b9-a55c-470187b3f661/volumes" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.314808 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.322209 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.859196 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.978335 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" event={"ID":"c2c37fca-5c5d-4565-9e8a-8427b2aca11f","Type":"ContainerDied","Data":"038ceb25d1790ca9632f5bbabef022604334aa12331209bbf8406b33abddb7fa"} Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.978408 5127 scope.go:117] "RemoveContainer" containerID="03e6a4c7e6336369aa3fa6cb3c84e5fddce779cfc6a3c791cbf95ddc761dcd00" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.978634 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b55d5755-jzw5k" Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.987209 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerStarted","Data":"b3036211d9037f15c90961640a7f99d0889af4e7011dc82cd2d8ee649d945792"} Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.987621 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerStarted","Data":"adada28198eedc642b28b58770b23d04b0a75bed16f0b725d2c28a82e9fd0b9c"} Feb 01 08:45:40 crc kubenswrapper[5127]: I0201 08:45:40.987651 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerStarted","Data":"b13e00caaa5d097a9ddc51774c0d7ce5a61a791af59ebc24d776f060a03eccfd"} Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.005236 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerStarted","Data":"aa30082a48df7a49d1b79ff14307c7195d84a254a03e45253f6b1273360ad7aa"} Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.005276 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerStarted","Data":"e5cba122bc059baaae6bf0bedaa46df86767b7a2f23c3c122d180c733e97e8ca"} Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.006798 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.011560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b28b6b-380f-408c-9605-3037ac9479ff","Type":"ContainerStarted","Data":"2403922ad68fbf2607501b56907d474866a620c0f99bc18a626aa479df2b74e8"} Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.018064 5127 scope.go:117] "RemoveContainer" containerID="f2a7860d84c78a5bf8cf0ed725fb03d91047d611f29c3afab34dc05c0d2728b8" Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.027506 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b55d5755-jzw5k"] Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.050245 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.05022392 podStartE2EDuration="2.05022392s" podCreationTimestamp="2026-02-01 08:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:41.015920367 +0000 UTC m=+7091.501822730" watchObservedRunningTime="2026-02-01 08:45:41.05022392 +0000 UTC m=+7091.536126283" Feb 01 08:45:41 crc kubenswrapper[5127]: I0201 08:45:41.063403 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.063385993 podStartE2EDuration="3.063385993s" podCreationTimestamp="2026-02-01 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:41.033281105 +0000 UTC m=+7091.519183458" watchObservedRunningTime="2026-02-01 08:45:41.063385993 +0000 UTC m=+7091.549288346" Feb 01 08:45:42 crc kubenswrapper[5127]: I0201 08:45:42.019897 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b28b6b-380f-408c-9605-3037ac9479ff","Type":"ContainerStarted","Data":"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890"} Feb 01 08:45:42 crc kubenswrapper[5127]: I0201 08:45:42.047369 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.047302096 podStartE2EDuration="3.047302096s" podCreationTimestamp="2026-02-01 08:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:42.039278452 +0000 UTC m=+7092.525180815" watchObservedRunningTime="2026-02-01 08:45:42.047302096 +0000 UTC m=+7092.533204459" Feb 01 08:45:42 crc kubenswrapper[5127]: I0201 08:45:42.247622 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" path="/var/lib/kubelet/pods/c2c37fca-5c5d-4565-9e8a-8427b2aca11f/volumes" Feb 01 08:45:43 crc kubenswrapper[5127]: I0201 08:45:43.151481 5127 scope.go:117] "RemoveContainer" containerID="87e22cc031b513a908bd0676ad35878354bc55f93252bec8875ef2c86a1ac889" Feb 01 08:45:43 crc kubenswrapper[5127]: I0201 08:45:43.194566 5127 scope.go:117] "RemoveContainer" containerID="6eebf3f94eb7e4f54e2885f2f1dafa042ce828d929b9877695d271cd1dab33ad" Feb 01 08:45:44 crc kubenswrapper[5127]: I0201 08:45:44.731971 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 08:45:44 crc kubenswrapper[5127]: I0201 08:45:44.732395 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 08:45:45 crc kubenswrapper[5127]: I0201 08:45:45.315922 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 08:45:46 crc kubenswrapper[5127]: I0201 08:45:46.587437 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.352106 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-r7zwl"] Feb 01 08:45:47 crc kubenswrapper[5127]: E0201 08:45:47.352730 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="init" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.352765 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="init" Feb 01 08:45:47 crc kubenswrapper[5127]: E0201 08:45:47.352803 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="dnsmasq-dns" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.352816 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="dnsmasq-dns" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.353106 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c37fca-5c5d-4565-9e8a-8427b2aca11f" containerName="dnsmasq-dns" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.358332 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.361835 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.364695 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.382962 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7zwl"] Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.494743 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.494856 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.495189 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.495307 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht29g\" (UniqueName: \"kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.596552 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.597759 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.597941 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.598007 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht29g\" (UniqueName: 
\"kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.602840 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.602978 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.612770 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.617173 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht29g\" (UniqueName: \"kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g\") pod \"nova-cell1-cell-mapping-r7zwl\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:47 crc kubenswrapper[5127]: I0201 08:45:47.686609 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:48 crc kubenswrapper[5127]: I0201 08:45:48.204509 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7zwl"] Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.106102 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7zwl" event={"ID":"d74da345-e546-4397-a7de-e285df23f3fd","Type":"ContainerStarted","Data":"2ae06785aebd6b6f7c3ede94dbdcf521f3d0ae20c6bf76e960c6c1bb0e5b897c"} Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.106788 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7zwl" event={"ID":"d74da345-e546-4397-a7de-e285df23f3fd","Type":"ContainerStarted","Data":"1a8c4ce104758c2c80e039a9167acf76b80a9e04e52d5d1af040e413f32576b7"} Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.130848 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-r7zwl" podStartSLOduration=2.130821473 podStartE2EDuration="2.130821473s" podCreationTimestamp="2026-02-01 08:45:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:45:49.122177431 +0000 UTC m=+7099.608079824" watchObservedRunningTime="2026-02-01 08:45:49.130821473 +0000 UTC m=+7099.616723876" Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.306307 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.306382 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.732175 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 08:45:49 crc kubenswrapper[5127]: I0201 08:45:49.732244 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.316729 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.374494 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.394790 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.395073 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.78:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.815963 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 
08:45:50 crc kubenswrapper[5127]: I0201 08:45:50.816000 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.79:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:45:51 crc kubenswrapper[5127]: I0201 08:45:51.152323 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 08:45:53 crc kubenswrapper[5127]: I0201 08:45:53.165762 5127 generic.go:334] "Generic (PLEG): container finished" podID="d74da345-e546-4397-a7de-e285df23f3fd" containerID="2ae06785aebd6b6f7c3ede94dbdcf521f3d0ae20c6bf76e960c6c1bb0e5b897c" exitCode=0 Feb 01 08:45:53 crc kubenswrapper[5127]: I0201 08:45:53.165844 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7zwl" event={"ID":"d74da345-e546-4397-a7de-e285df23f3fd","Type":"ContainerDied","Data":"2ae06785aebd6b6f7c3ede94dbdcf521f3d0ae20c6bf76e960c6c1bb0e5b897c"} Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.596649 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.733837 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle\") pod \"d74da345-e546-4397-a7de-e285df23f3fd\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.733895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts\") pod \"d74da345-e546-4397-a7de-e285df23f3fd\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.733959 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data\") pod \"d74da345-e546-4397-a7de-e285df23f3fd\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.734126 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht29g\" (UniqueName: \"kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g\") pod \"d74da345-e546-4397-a7de-e285df23f3fd\" (UID: \"d74da345-e546-4397-a7de-e285df23f3fd\") " Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.739446 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts" (OuterVolumeSpecName: "scripts") pod "d74da345-e546-4397-a7de-e285df23f3fd" (UID: "d74da345-e546-4397-a7de-e285df23f3fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.739727 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g" (OuterVolumeSpecName: "kube-api-access-ht29g") pod "d74da345-e546-4397-a7de-e285df23f3fd" (UID: "d74da345-e546-4397-a7de-e285df23f3fd"). InnerVolumeSpecName "kube-api-access-ht29g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.759065 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data" (OuterVolumeSpecName: "config-data") pod "d74da345-e546-4397-a7de-e285df23f3fd" (UID: "d74da345-e546-4397-a7de-e285df23f3fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.762423 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d74da345-e546-4397-a7de-e285df23f3fd" (UID: "d74da345-e546-4397-a7de-e285df23f3fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.836607 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht29g\" (UniqueName: \"kubernetes.io/projected/d74da345-e546-4397-a7de-e285df23f3fd-kube-api-access-ht29g\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.836671 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.836684 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:54 crc kubenswrapper[5127]: I0201 08:45:54.836698 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74da345-e546-4397-a7de-e285df23f3fd-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.188751 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-r7zwl" event={"ID":"d74da345-e546-4397-a7de-e285df23f3fd","Type":"ContainerDied","Data":"1a8c4ce104758c2c80e039a9167acf76b80a9e04e52d5d1af040e413f32576b7"} Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.188827 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8c4ce104758c2c80e039a9167acf76b80a9e04e52d5d1af040e413f32576b7" Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.188838 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-r7zwl" Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.484004 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.487741 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-log" containerID="cri-o://e5cba122bc059baaae6bf0bedaa46df86767b7a2f23c3c122d180c733e97e8ca" gracePeriod=30 Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.487937 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-api" containerID="cri-o://aa30082a48df7a49d1b79ff14307c7195d84a254a03e45253f6b1273360ad7aa" gracePeriod=30 Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.506225 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.506879 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-log" containerID="cri-o://adada28198eedc642b28b58770b23d04b0a75bed16f0b725d2c28a82e9fd0b9c" gracePeriod=30 Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.506989 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-metadata" containerID="cri-o://b3036211d9037f15c90961640a7f99d0889af4e7011dc82cd2d8ee649d945792" gracePeriod=30 Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.533938 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:55 crc kubenswrapper[5127]: I0201 08:45:55.534159 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04b28b6b-380f-408c-9605-3037ac9479ff" containerName="nova-scheduler-scheduler" containerID="cri-o://2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890" gracePeriod=30 Feb 01 08:45:56 crc kubenswrapper[5127]: I0201 08:45:56.198649 5127 generic.go:334] "Generic (PLEG): container finished" podID="88b171b2-8077-4c94-89ae-76b74046161b" containerID="adada28198eedc642b28b58770b23d04b0a75bed16f0b725d2c28a82e9fd0b9c" exitCode=143 Feb 01 08:45:56 crc kubenswrapper[5127]: I0201 08:45:56.198713 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerDied","Data":"adada28198eedc642b28b58770b23d04b0a75bed16f0b725d2c28a82e9fd0b9c"} Feb 01 08:45:56 crc kubenswrapper[5127]: I0201 08:45:56.200765 5127 generic.go:334] "Generic (PLEG): container finished" podID="0e06c7f5-77f6-4f8f-b335-891972280668" containerID="e5cba122bc059baaae6bf0bedaa46df86767b7a2f23c3c122d180c733e97e8ca" exitCode=143 Feb 01 08:45:56 crc kubenswrapper[5127]: I0201 08:45:56.200793 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerDied","Data":"e5cba122bc059baaae6bf0bedaa46df86767b7a2f23c3c122d180c733e97e8ca"} Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.754648 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.840094 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle\") pod \"04b28b6b-380f-408c-9605-3037ac9479ff\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.840194 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data\") pod \"04b28b6b-380f-408c-9605-3037ac9479ff\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.841001 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94jrl\" (UniqueName: \"kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl\") pod \"04b28b6b-380f-408c-9605-3037ac9479ff\" (UID: \"04b28b6b-380f-408c-9605-3037ac9479ff\") " Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.847958 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl" (OuterVolumeSpecName: "kube-api-access-94jrl") pod "04b28b6b-380f-408c-9605-3037ac9479ff" (UID: "04b28b6b-380f-408c-9605-3037ac9479ff"). InnerVolumeSpecName "kube-api-access-94jrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.872691 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b28b6b-380f-408c-9605-3037ac9479ff" (UID: "04b28b6b-380f-408c-9605-3037ac9479ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.875229 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data" (OuterVolumeSpecName: "config-data") pod "04b28b6b-380f-408c-9605-3037ac9479ff" (UID: "04b28b6b-380f-408c-9605-3037ac9479ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.943099 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.943127 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94jrl\" (UniqueName: \"kubernetes.io/projected/04b28b6b-380f-408c-9605-3037ac9479ff-kube-api-access-94jrl\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:57 crc kubenswrapper[5127]: I0201 08:45:57.943140 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b28b6b-380f-408c-9605-3037ac9479ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.218648 5127 generic.go:334] "Generic (PLEG): container finished" podID="04b28b6b-380f-408c-9605-3037ac9479ff" containerID="2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890" exitCode=0 Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.218695 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b28b6b-380f-408c-9605-3037ac9479ff","Type":"ContainerDied","Data":"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890"} Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.218725 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04b28b6b-380f-408c-9605-3037ac9479ff","Type":"ContainerDied","Data":"2403922ad68fbf2607501b56907d474866a620c0f99bc18a626aa479df2b74e8"} Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.218745 5127 scope.go:117] "RemoveContainer" containerID="2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.219029 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.259337 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.269031 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.272530 5127 scope.go:117] "RemoveContainer" containerID="2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890" Feb 01 08:45:58 crc kubenswrapper[5127]: E0201 08:45:58.273164 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890\": container with ID starting with 2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890 not found: ID does not exist" containerID="2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.273235 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890"} err="failed to get container status \"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890\": rpc error: code = NotFound desc = could not find container \"2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890\": container with ID starting with 2c67a4e82acb99dfe97386cbe174d4ae9b0ec8d912462534e26fcc5bd748d890 not found: ID does not exist" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.283198 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:58 crc kubenswrapper[5127]: E0201 08:45:58.283655 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b28b6b-380f-408c-9605-3037ac9479ff" containerName="nova-scheduler-scheduler" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.283675 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b28b6b-380f-408c-9605-3037ac9479ff" containerName="nova-scheduler-scheduler" Feb 01 08:45:58 crc kubenswrapper[5127]: E0201 08:45:58.283694 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74da345-e546-4397-a7de-e285df23f3fd" containerName="nova-manage" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.283703 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74da345-e546-4397-a7de-e285df23f3fd" containerName="nova-manage" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.283898 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74da345-e546-4397-a7de-e285df23f3fd" containerName="nova-manage" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.283926 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b28b6b-380f-408c-9605-3037ac9479ff" containerName="nova-scheduler-scheduler" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.286116 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.288785 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.303272 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.352573 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.352646 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.352697 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwtdm\" (UniqueName: \"kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.453522 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.453583 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.453644 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwtdm\" (UniqueName: \"kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.458216 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.466380 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.471561 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtdm\" (UniqueName: 
\"kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm\") pod \"nova-scheduler-0\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " pod="openstack/nova-scheduler-0" Feb 01 08:45:58 crc kubenswrapper[5127]: I0201 08:45:58.607074 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.235123 5127 generic.go:334] "Generic (PLEG): container finished" podID="88b171b2-8077-4c94-89ae-76b74046161b" containerID="b3036211d9037f15c90961640a7f99d0889af4e7011dc82cd2d8ee649d945792" exitCode=0 Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.235373 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerDied","Data":"b3036211d9037f15c90961640a7f99d0889af4e7011dc82cd2d8ee649d945792"} Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.242437 5127 generic.go:334] "Generic (PLEG): container finished" podID="0e06c7f5-77f6-4f8f-b335-891972280668" containerID="aa30082a48df7a49d1b79ff14307c7195d84a254a03e45253f6b1273360ad7aa" exitCode=0 Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.242496 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerDied","Data":"aa30082a48df7a49d1b79ff14307c7195d84a254a03e45253f6b1273360ad7aa"} Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.252529 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.285060 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.329007 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.373800 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs\") pod \"88b171b2-8077-4c94-89ae-76b74046161b\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.373845 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlkhn\" (UniqueName: \"kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn\") pod \"0e06c7f5-77f6-4f8f-b335-891972280668\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.373928 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs\") pod \"0e06c7f5-77f6-4f8f-b335-891972280668\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.373971 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle\") pod \"88b171b2-8077-4c94-89ae-76b74046161b\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374502 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs" (OuterVolumeSpecName: "logs") pod "0e06c7f5-77f6-4f8f-b335-891972280668" (UID: "0e06c7f5-77f6-4f8f-b335-891972280668"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374543 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs" (OuterVolumeSpecName: "logs") pod "88b171b2-8077-4c94-89ae-76b74046161b" (UID: "88b171b2-8077-4c94-89ae-76b74046161b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374600 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data\") pod \"0e06c7f5-77f6-4f8f-b335-891972280668\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374706 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data\") pod \"88b171b2-8077-4c94-89ae-76b74046161b\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374735 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfb9g\" (UniqueName: \"kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g\") pod \"88b171b2-8077-4c94-89ae-76b74046161b\" (UID: \"88b171b2-8077-4c94-89ae-76b74046161b\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.374780 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle\") pod \"0e06c7f5-77f6-4f8f-b335-891972280668\" (UID: \"0e06c7f5-77f6-4f8f-b335-891972280668\") " Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.375526 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e06c7f5-77f6-4f8f-b335-891972280668-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.375543 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88b171b2-8077-4c94-89ae-76b74046161b-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.378695 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g" (OuterVolumeSpecName: "kube-api-access-wfb9g") pod "88b171b2-8077-4c94-89ae-76b74046161b" (UID: "88b171b2-8077-4c94-89ae-76b74046161b"). InnerVolumeSpecName "kube-api-access-wfb9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.379608 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn" (OuterVolumeSpecName: "kube-api-access-mlkhn") pod "0e06c7f5-77f6-4f8f-b335-891972280668" (UID: "0e06c7f5-77f6-4f8f-b335-891972280668"). InnerVolumeSpecName "kube-api-access-mlkhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.400505 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88b171b2-8077-4c94-89ae-76b74046161b" (UID: "88b171b2-8077-4c94-89ae-76b74046161b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.408550 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data" (OuterVolumeSpecName: "config-data") pod "0e06c7f5-77f6-4f8f-b335-891972280668" (UID: "0e06c7f5-77f6-4f8f-b335-891972280668"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.411476 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data" (OuterVolumeSpecName: "config-data") pod "88b171b2-8077-4c94-89ae-76b74046161b" (UID: "88b171b2-8077-4c94-89ae-76b74046161b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.418133 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e06c7f5-77f6-4f8f-b335-891972280668" (UID: "0e06c7f5-77f6-4f8f-b335-891972280668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477359 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlkhn\" (UniqueName: \"kubernetes.io/projected/0e06c7f5-77f6-4f8f-b335-891972280668-kube-api-access-mlkhn\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477405 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477414 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477427 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88b171b2-8077-4c94-89ae-76b74046161b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477438 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfb9g\" (UniqueName: \"kubernetes.io/projected/88b171b2-8077-4c94-89ae-76b74046161b-kube-api-access-wfb9g\") on node \"crc\" DevicePath \"\"" Feb 01 08:45:59 crc kubenswrapper[5127]: I0201 08:45:59.477448 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e06c7f5-77f6-4f8f-b335-891972280668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.264087 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b28b6b-380f-408c-9605-3037ac9479ff" path="/var/lib/kubelet/pods/04b28b6b-380f-408c-9605-3037ac9479ff/volumes" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.294178 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"88b171b2-8077-4c94-89ae-76b74046161b","Type":"ContainerDied","Data":"b13e00caaa5d097a9ddc51774c0d7ce5a61a791af59ebc24d776f060a03eccfd"} Feb 01 08:46:00 crc 
kubenswrapper[5127]: I0201 08:46:00.294275 5127 scope.go:117] "RemoveContainer" containerID="b3036211d9037f15c90961640a7f99d0889af4e7011dc82cd2d8ee649d945792" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.294318 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.304249 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.304307 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e06c7f5-77f6-4f8f-b335-891972280668","Type":"ContainerDied","Data":"644e1b49b3154e5645c376677a8e42eea138b1a7b483c66781efdad7a09a054e"} Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.320758 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7181dbcd-6372-4645-8426-47aaaf3eb576","Type":"ContainerStarted","Data":"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5"} Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.320812 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7181dbcd-6372-4645-8426-47aaaf3eb576","Type":"ContainerStarted","Data":"7e87e3268dddae25b828c71e9b209aaff42b4a7a4a3165d54bd226e4b54c7ecc"} Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.329989 5127 scope.go:117] "RemoveContainer" containerID="adada28198eedc642b28b58770b23d04b0a75bed16f0b725d2c28a82e9fd0b9c" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.330922 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.379114 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.393882 5127 scope.go:117] "RemoveContainer" containerID="aa30082a48df7a49d1b79ff14307c7195d84a254a03e45253f6b1273360ad7aa" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.394064 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.398746 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.405437 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: E0201 08:46:00.406226 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-metadata" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406247 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-metadata" Feb 01 08:46:00 crc kubenswrapper[5127]: E0201 08:46:00.406263 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-log" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406269 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-log" Feb 01 08:46:00 crc kubenswrapper[5127]: E0201 08:46:00.406309 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-log" Feb 01 08:46:00 crc 
kubenswrapper[5127]: I0201 08:46:00.406316 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-log" Feb 01 08:46:00 crc kubenswrapper[5127]: E0201 08:46:00.406337 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-api" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406342 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-api" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406722 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-log" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406768 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-metadata" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406779 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b171b2-8077-4c94-89ae-76b74046161b" containerName="nova-metadata-log" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.406792 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" containerName="nova-api-api" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.408392 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.408373218 podStartE2EDuration="2.408373218s" podCreationTimestamp="2026-02-01 08:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:46:00.349114326 +0000 UTC m=+7110.835016689" watchObservedRunningTime="2026-02-01 08:46:00.408373218 +0000 UTC m=+7110.894275571" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.409489 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.415676 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.420472 5127 scope.go:117] "RemoveContainer" containerID="e5cba122bc059baaae6bf0bedaa46df86767b7a2f23c3c122d180c733e97e8ca" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.421793 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.423484 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.435048 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.444643 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.445009 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.503533 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.503657 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjr9c\" (UniqueName: \"kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.503971 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.504064 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.504101 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.504232 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml7ts\" (UniqueName: \"kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.504294 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.504436 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: 
I0201 08:46:00.606329 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606395 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606479 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml7ts\" (UniqueName: \"kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606512 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606571 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606651 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606689 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjr9c\" (UniqueName: \"kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.606761 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.607000 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.607330 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0" Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.612541 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.613290 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.613358 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.619642 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.625273 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml7ts\" (UniqueName: \"kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts\") pod \"nova-api-0\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " pod="openstack/nova-api-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.629095 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjr9c\" (UniqueName: \"kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c\") pod \"nova-metadata-0\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " pod="openstack/nova-metadata-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.740089 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 01 08:46:00 crc kubenswrapper[5127]: I0201 08:46:00.770651 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 01 08:46:01 crc kubenswrapper[5127]: I0201 08:46:01.270606 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 01 08:46:01 crc kubenswrapper[5127]: W0201 08:46:01.276220 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64c5b944_c0b8_4afe_a270_25911efaf8fb.slice/crio-b11e74ac2fd8b4f1940c63ca7099538a3684d22ff48e7f570205b42003f22ab8 WatchSource:0}: Error finding container b11e74ac2fd8b4f1940c63ca7099538a3684d22ff48e7f570205b42003f22ab8: Status 404 returned error can't find the container with id b11e74ac2fd8b4f1940c63ca7099538a3684d22ff48e7f570205b42003f22ab8
Feb 01 08:46:01 crc kubenswrapper[5127]: I0201 08:46:01.332767 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerStarted","Data":"b11e74ac2fd8b4f1940c63ca7099538a3684d22ff48e7f570205b42003f22ab8"}
Feb 01 08:46:01 crc kubenswrapper[5127]: I0201 08:46:01.403224 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 01 08:46:01 crc kubenswrapper[5127]: W0201 08:46:01.408385 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ef2d54a_e179_42ed_8a5e_60a0ec1606bb.slice/crio-90f4b6747ef929142f1c8e877c5a37034141e6c5acdfe2f01384073c978ac4ee WatchSource:0}: Error finding container 90f4b6747ef929142f1c8e877c5a37034141e6c5acdfe2f01384073c978ac4ee: Status 404 returned error can't find the container with id 90f4b6747ef929142f1c8e877c5a37034141e6c5acdfe2f01384073c978ac4ee
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.245173 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e06c7f5-77f6-4f8f-b335-891972280668" path="/var/lib/kubelet/pods/0e06c7f5-77f6-4f8f-b335-891972280668/volumes"
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.246185 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b171b2-8077-4c94-89ae-76b74046161b" path="/var/lib/kubelet/pods/88b171b2-8077-4c94-89ae-76b74046161b/volumes"
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.346876 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerStarted","Data":"acf340decbd8b5742ec6c8762fe554f7abeb6f14bbbca5fd0a5dd93f6cd00ff7"}
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.346929 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerStarted","Data":"1bc54232db9b38feab09634736f918444e1fda9bb98ff872bc6db9be6f7ec4c8"}
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.351385 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerStarted","Data":"9a038261d3adb8fa12352ac4b502ae568f5da97378f1c7d25ba7ecd7d1d8076c"}
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.351436 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerStarted","Data":"2c5cac0f071d6df627852ba9752921bb000b427c77fcfb3a6dbd59e20a17ced4"}
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.351449 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerStarted","Data":"90f4b6747ef929142f1c8e877c5a37034141e6c5acdfe2f01384073c978ac4ee"}
Feb 01 08:46:02 crc kubenswrapper[5127]: I0201 08:46:02.376805 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.376781451 podStartE2EDuration="2.376781451s" podCreationTimestamp="2026-02-01 08:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:46:02.364879891 +0000 UTC m=+7112.850782274" watchObservedRunningTime="2026-02-01 08:46:02.376781451 +0000 UTC m=+7112.862683814"
Feb 01 08:46:03 crc kubenswrapper[5127]: I0201 08:46:03.608421 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 01 08:46:05 crc kubenswrapper[5127]: I0201 08:46:05.741100 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 08:46:05 crc kubenswrapper[5127]: I0201 08:46:05.741536 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 01 08:46:08 crc kubenswrapper[5127]: I0201 08:46:08.608330 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 01 08:46:08 crc kubenswrapper[5127]: I0201 08:46:08.654561 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 01 08:46:08 crc kubenswrapper[5127]: I0201 08:46:08.672825 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=8.672807742 podStartE2EDuration="8.672807742s" podCreationTimestamp="2026-02-01 08:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:46:02.393664054 +0000 UTC m=+7112.879566457" watchObservedRunningTime="2026-02-01 08:46:08.672807742 +0000 UTC m=+7119.158710095"
Feb 01 08:46:09 crc kubenswrapper[5127]: I0201 08:46:09.475565 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 01 08:46:10 crc kubenswrapper[5127]: I0201 08:46:10.740967 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 01 08:46:10 crc kubenswrapper[5127]: I0201 08:46:10.741004 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 01 08:46:10 crc kubenswrapper[5127]: I0201 08:46:10.772271 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 01 08:46:10 crc kubenswrapper[5127]: I0201 08:46:10.772371 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 01 08:46:11 crc kubenswrapper[5127]: I0201 08:46:11.867956 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 01 08:46:11 crc kubenswrapper[5127]: I0201 08:46:11.908870 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 01 08:46:11 crc kubenswrapper[5127]: I0201 08:46:11.908918 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 01 08:46:11 crc kubenswrapper[5127]: I0201 08:46:11.908932 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.744541 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.745136 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.748842 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.749508 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.785107 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.786138 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.786186 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 08:46:20 crc kubenswrapper[5127]: I0201 08:46:20.788569 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.578137 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.584224 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.771466 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"]
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.773305 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.781854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.781902 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnthr\" (UniqueName: \"kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.781933 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.781975 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.782000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.793114 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"]
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.882958 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.882996 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnthr\" (UniqueName: \"kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.883024 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.883072 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.883092 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.883879 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.884707 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.885440 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.885932 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:21 crc kubenswrapper[5127]: I0201 08:46:21.908627 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnthr\" (UniqueName: \"kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr\") pod \"dnsmasq-dns-59cc7954dc-ql6pq\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:22 crc kubenswrapper[5127]: I0201 08:46:22.097281 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:22 crc kubenswrapper[5127]: W0201 08:46:22.587746 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode11ed41e_754f_46f4_af29_99cef23e3ef6.slice/crio-99952e5c046f5289b57f93355c7b7290adc99f23163ddd3f9476dfc5e3277a08 WatchSource:0}: Error finding container 99952e5c046f5289b57f93355c7b7290adc99f23163ddd3f9476dfc5e3277a08: Status 404 returned error can't find the container with id 99952e5c046f5289b57f93355c7b7290adc99f23163ddd3f9476dfc5e3277a08
Feb 01 08:46:22 crc kubenswrapper[5127]: I0201 08:46:22.589065 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"]
Feb 01 08:46:23 crc kubenswrapper[5127]: I0201 08:46:23.599409 5127 generic.go:334] "Generic (PLEG): container finished" podID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerID="28497879c600eafa78e23dffaf3bb27d9c09a425ec939134d138657831d423cd" exitCode=0
Feb 01 08:46:23 crc kubenswrapper[5127]: I0201 08:46:23.599478 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" event={"ID":"e11ed41e-754f-46f4-af29-99cef23e3ef6","Type":"ContainerDied","Data":"28497879c600eafa78e23dffaf3bb27d9c09a425ec939134d138657831d423cd"}
Feb 01 08:46:23 crc kubenswrapper[5127]: I0201 08:46:23.600050 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" event={"ID":"e11ed41e-754f-46f4-af29-99cef23e3ef6","Type":"ContainerStarted","Data":"99952e5c046f5289b57f93355c7b7290adc99f23163ddd3f9476dfc5e3277a08"}
Feb 01 08:46:24 crc kubenswrapper[5127]: I0201 08:46:24.608816 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" event={"ID":"e11ed41e-754f-46f4-af29-99cef23e3ef6","Type":"ContainerStarted","Data":"2865fa79137da1928bf6ad7777870c0f2dbd18d0eb1728e079e44304677ebe08"}
Feb 01 08:46:24 crc kubenswrapper[5127]: I0201 08:46:24.609318 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:24 crc kubenswrapper[5127]: I0201 08:46:24.642407 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" podStartSLOduration=3.642388114 podStartE2EDuration="3.642388114s" podCreationTimestamp="2026-02-01 08:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:46:24.633138975 +0000 UTC m=+7135.119041378" watchObservedRunningTime="2026-02-01 08:46:24.642388114 +0000 UTC m=+7135.128290477"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.098810 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.177894 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"]
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.178473 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="dnsmasq-dns" containerID="cri-o://d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e" gracePeriod=10
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.710554 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.712124 5127 generic.go:334] "Generic (PLEG): container finished" podID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerID="d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e" exitCode=0
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.712169 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" event={"ID":"c71f4d2e-f115-44a7-bd74-aa0104e156ab","Type":"ContainerDied","Data":"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"}
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.712208 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr" event={"ID":"c71f4d2e-f115-44a7-bd74-aa0104e156ab","Type":"ContainerDied","Data":"0216ec29c0c158afb49f9283d8d66c455ddfd81a7979e69648aaff4a0897a0eb"}
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.712229 5127 scope.go:117] "RemoveContainer" containerID="d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.734705 5127 scope.go:117] "RemoveContainer" containerID="b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.759395 5127 scope.go:117] "RemoveContainer" containerID="d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"
Feb 01 08:46:32 crc kubenswrapper[5127]: E0201 08:46:32.759992 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e\": container with ID starting with d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e not found: ID does not exist" containerID="d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.760023 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e"} err="failed to get container status \"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e\": rpc error: code = NotFound desc = could not find container \"d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e\": container with ID starting with d3b8797b1fb581feb759155089be9d7c4f6348dc3c8b3a1bd41f115cf6330d6e not found: ID does not exist"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.760044 5127 scope.go:117] "RemoveContainer" containerID="b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31"
Feb 01 08:46:32 crc kubenswrapper[5127]: E0201 08:46:32.771717 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31\": container with ID starting with b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31 not found: ID does not exist" containerID="b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.771775 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31"} err="failed to get container status \"b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31\": rpc error: code = NotFound desc = could not find container \"b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31\": container with ID starting with b976ff3ecda001ffbed088d48acfa249599221a03b482afe14380f69a4b61d31 not found: ID does not exist"
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.803735 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb\") pod \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") "
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.803836 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc\") pod \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") "
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.803861 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config\") pod \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") "
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.803894 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6vh9\" (UniqueName: \"kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9\") pod \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") "
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.803944 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb\") pod \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\" (UID: \"c71f4d2e-f115-44a7-bd74-aa0104e156ab\") "
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.819168 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9" (OuterVolumeSpecName: "kube-api-access-l6vh9") pod "c71f4d2e-f115-44a7-bd74-aa0104e156ab" (UID: "c71f4d2e-f115-44a7-bd74-aa0104e156ab"). InnerVolumeSpecName "kube-api-access-l6vh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.915860 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6vh9\" (UniqueName: \"kubernetes.io/projected/c71f4d2e-f115-44a7-bd74-aa0104e156ab-kube-api-access-l6vh9\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.916979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c71f4d2e-f115-44a7-bd74-aa0104e156ab" (UID: "c71f4d2e-f115-44a7-bd74-aa0104e156ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.933098 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c71f4d2e-f115-44a7-bd74-aa0104e156ab" (UID: "c71f4d2e-f115-44a7-bd74-aa0104e156ab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.963198 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c71f4d2e-f115-44a7-bd74-aa0104e156ab" (UID: "c71f4d2e-f115-44a7-bd74-aa0104e156ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:32 crc kubenswrapper[5127]: I0201 08:46:32.983075 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config" (OuterVolumeSpecName: "config") pod "c71f4d2e-f115-44a7-bd74-aa0104e156ab" (UID: "c71f4d2e-f115-44a7-bd74-aa0104e156ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.017784 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.017826 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.017838 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-config\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.017846 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c71f4d2e-f115-44a7-bd74-aa0104e156ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.727375 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fc8b578c-v5nfr"
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.796508 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"]
Feb 01 08:46:33 crc kubenswrapper[5127]: I0201 08:46:33.799388 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fc8b578c-v5nfr"]
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.249938 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" path="/var/lib/kubelet/pods/c71f4d2e-f115-44a7-bd74-aa0104e156ab/volumes"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.669630 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-p2tgk"]
Feb 01 08:46:34 crc kubenswrapper[5127]: E0201 08:46:34.670008 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="dnsmasq-dns"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.670022 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="dnsmasq-dns"
Feb 01 08:46:34 crc kubenswrapper[5127]: E0201 08:46:34.670060 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="init"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.670066 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="init"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.670244 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c71f4d2e-f115-44a7-bd74-aa0104e156ab" containerName="dnsmasq-dns"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.670839 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.679267 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p2tgk"]
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.691654 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-1d5c-account-create-update-m2g4t"]
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.692944 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.695278 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.700077 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1d5c-account-create-update-m2g4t"]
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.757172 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.757286 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwk4b\" (UniqueName: \"kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.859043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.859144 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.859205 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwk4b\" (UniqueName: \"kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.859244 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxdq\" (UniqueName: \"kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.860531 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.882852 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwk4b\" (UniqueName: \"kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b\") pod \"cinder-db-create-p2tgk\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") " pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.961041 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.961130 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxdq\" (UniqueName: \"kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.961835 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.980162 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxdq\" (UniqueName: \"kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq\") pod \"cinder-1d5c-account-create-update-m2g4t\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") " pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:34 crc kubenswrapper[5127]: I0201 08:46:34.989005 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.018614 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.494742 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-p2tgk"]
Feb 01 08:46:35 crc kubenswrapper[5127]: W0201 08:46:35.560545 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod337262d4_08b6_4c72_82eb_7b5c230e384b.slice/crio-063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b WatchSource:0}: Error finding container 063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b: Status 404 returned error can't find the container with id 063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.567213 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-1d5c-account-create-update-m2g4t"]
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.746230 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d5c-account-create-update-m2g4t" event={"ID":"337262d4-08b6-4c72-82eb-7b5c230e384b","Type":"ContainerStarted","Data":"063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b"}
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.747956 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p2tgk" event={"ID":"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a","Type":"ContainerStarted","Data":"0b1cfd03845ed98280b6782c940383e5524925d7e38d307d05c32d3e7aa8ef3f"}
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.747986 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p2tgk" event={"ID":"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a","Type":"ContainerStarted","Data":"d2f148423c68d6268feb4e05779b088c50fde49f6ce9234001341a4e6740b320"}
Feb 01 08:46:35 crc kubenswrapper[5127]: I0201 08:46:35.766639 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-p2tgk" podStartSLOduration=1.7666164690000001 podStartE2EDuration="1.766616469s" podCreationTimestamp="2026-02-01 08:46:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:46:35.765247572 +0000 UTC m=+7146.251149955" watchObservedRunningTime="2026-02-01 08:46:35.766616469 +0000 UTC m=+7146.252518842"
Feb 01 08:46:36 crc kubenswrapper[5127]: I0201 08:46:36.761263 5127 generic.go:334] "Generic (PLEG): container finished" podID="337262d4-08b6-4c72-82eb-7b5c230e384b" containerID="c1504ede8b2f6da26e2c57585acad76bdd4f079215e12ab362e163b6ac73e19a" exitCode=0
Feb 01 08:46:36 crc kubenswrapper[5127]: I0201 08:46:36.761392 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d5c-account-create-update-m2g4t" event={"ID":"337262d4-08b6-4c72-82eb-7b5c230e384b","Type":"ContainerDied","Data":"c1504ede8b2f6da26e2c57585acad76bdd4f079215e12ab362e163b6ac73e19a"}
Feb 01 08:46:36 crc kubenswrapper[5127]: I0201 08:46:36.764643 5127 generic.go:334] "Generic (PLEG): container finished" podID="a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" containerID="0b1cfd03845ed98280b6782c940383e5524925d7e38d307d05c32d3e7aa8ef3f" exitCode=0
Feb 01 08:46:36 crc kubenswrapper[5127]: I0201 08:46:36.764699 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p2tgk" event={"ID":"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a","Type":"ContainerDied","Data":"0b1cfd03845ed98280b6782c940383e5524925d7e38d307d05c32d3e7aa8ef3f"}
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.226809 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.236518 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.323999 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts\") pod \"337262d4-08b6-4c72-82eb-7b5c230e384b\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") "
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.324073 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdxdq\" (UniqueName: \"kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq\") pod \"337262d4-08b6-4c72-82eb-7b5c230e384b\" (UID: \"337262d4-08b6-4c72-82eb-7b5c230e384b\") "
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.324092 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwk4b\" (UniqueName: \"kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b\") pod \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") "
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.324163 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts\") pod \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\" (UID: \"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a\") "
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.325048 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "337262d4-08b6-4c72-82eb-7b5c230e384b" (UID: "337262d4-08b6-4c72-82eb-7b5c230e384b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.325116 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" (UID: "a2af6f30-000e-4d33-ae4c-e26cdd6ee07a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.329322 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b" (OuterVolumeSpecName: "kube-api-access-qwk4b") pod "a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" (UID: "a2af6f30-000e-4d33-ae4c-e26cdd6ee07a"). InnerVolumeSpecName "kube-api-access-qwk4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.331127 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq" (OuterVolumeSpecName: "kube-api-access-mdxdq") pod "337262d4-08b6-4c72-82eb-7b5c230e384b" (UID: "337262d4-08b6-4c72-82eb-7b5c230e384b"). InnerVolumeSpecName "kube-api-access-mdxdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.426167 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.426390 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/337262d4-08b6-4c72-82eb-7b5c230e384b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.426454 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdxdq\" (UniqueName: \"kubernetes.io/projected/337262d4-08b6-4c72-82eb-7b5c230e384b-kube-api-access-mdxdq\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.426525 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwk4b\" (UniqueName: \"kubernetes.io/projected/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a-kube-api-access-qwk4b\") on node \"crc\" DevicePath \"\""
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.790029 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-p2tgk" event={"ID":"a2af6f30-000e-4d33-ae4c-e26cdd6ee07a","Type":"ContainerDied","Data":"d2f148423c68d6268feb4e05779b088c50fde49f6ce9234001341a4e6740b320"}
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.790082 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f148423c68d6268feb4e05779b088c50fde49f6ce9234001341a4e6740b320"
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.790151 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-p2tgk"
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.798289 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-1d5c-account-create-update-m2g4t" event={"ID":"337262d4-08b6-4c72-82eb-7b5c230e384b","Type":"ContainerDied","Data":"063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b"}
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.798372 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063feafaedf4d8ddb05802d980457b6f4b420c98c421798a45f87e0543aa7e5b"
Feb 01 08:46:38 crc kubenswrapper[5127]: I0201 08:46:38.798390 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-1d5c-account-create-update-m2g4t"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.989528 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b8lmt"]
Feb 01 08:46:39 crc kubenswrapper[5127]: E0201 08:46:39.990328 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337262d4-08b6-4c72-82eb-7b5c230e384b" containerName="mariadb-account-create-update"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.990347 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="337262d4-08b6-4c72-82eb-7b5c230e384b" containerName="mariadb-account-create-update"
Feb 01 08:46:39 crc kubenswrapper[5127]: E0201 08:46:39.990369 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" containerName="mariadb-database-create"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.990376 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" containerName="mariadb-database-create"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.990566 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" containerName="mariadb-database-create"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.990606 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="337262d4-08b6-4c72-82eb-7b5c230e384b" containerName="mariadb-account-create-update"
Feb 01 08:46:39 crc kubenswrapper[5127]: I0201 08:46:39.991380 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:39.999950 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8lmt"]
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.039455 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.039457 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ns9jb"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.039717 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.074797 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.074905 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp2f8\" (UniqueName: \"kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.074963 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.075006 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.075037 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.075082 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.176798 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.176429 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.177340 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp2f8\" (UniqueName: \"kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.177883 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.179160 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.179224 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.179455 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.189527 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.189925 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.190698 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.193147 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.195440 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp2f8\" (UniqueName: \"kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8\") pod \"cinder-db-sync-b8lmt\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") " pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.356639 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:46:40 crc kubenswrapper[5127]: I0201 08:46:40.998673 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8lmt"]
Feb 01 08:46:41 crc kubenswrapper[5127]: W0201 08:46:41.004611 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb1b5ac_6358_4824_8649_5c48340d4349.slice/crio-7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473 WatchSource:0}: Error finding container 7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473: Status 404 returned error can't find the container with id 7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473
Feb 01 08:46:41 crc kubenswrapper[5127]: I0201 08:46:41.827988 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8lmt" event={"ID":"ccb1b5ac-6358-4824-8649-5c48340d4349","Type":"ContainerStarted","Data":"7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473"}
Feb 01 08:47:01 crc kubenswrapper[5127]: E0201 08:47:01.945069 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:47:01 crc kubenswrapper[5127]: E0201 08:47:01.945705 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871"
Feb 01 08:47:01 crc kubenswrapper[5127]: E0201 08:47:01.945932 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gp2f8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b8lmt_openstack(ccb1b5ac-6358-4824-8649-5c48340d4349): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 01 08:47:01 crc kubenswrapper[5127]: E0201 08:47:01.947506 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b8lmt" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349"
Feb 01 08:47:02 crc kubenswrapper[5127]: E0201 08:47:02.081662 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/cinder-db-sync-b8lmt" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349"
Feb 01 08:47:06 crc kubenswrapper[5127]: I0201 08:47:06.740623 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 08:47:06 crc kubenswrapper[5127]: I0201 08:47:06.740994 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 08:47:18 crc kubenswrapper[5127]: I0201 08:47:18.274175 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8lmt" event={"ID":"ccb1b5ac-6358-4824-8649-5c48340d4349","Type":"ContainerStarted","Data":"1fcbba84458820a6cd56e7847b84698f6911603ed0c37b59c69400f84044c11e"}
Feb 01 08:47:18 crc kubenswrapper[5127]: I0201 08:47:18.289844 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b8lmt" podStartSLOduration=2.927587571 podStartE2EDuration="39.289826597s" podCreationTimestamp="2026-02-01 08:46:39 +0000 UTC" firstStartedPulling="2026-02-01 08:46:41.006768072 +0000 UTC m=+7151.492670435" lastFinishedPulling="2026-02-01 08:47:17.369007098 +0000 UTC m=+7187.854909461" observedRunningTime="2026-02-01 08:47:18.287752011 +0000 UTC m=+7188.773654394" watchObservedRunningTime="2026-02-01 08:47:18.289826597 +0000 UTC m=+7188.775728960"
Feb 01 08:47:21 crc kubenswrapper[5127]: I0201 08:47:21.315572 5127 generic.go:334] "Generic (PLEG): container finished" podID="ccb1b5ac-6358-4824-8649-5c48340d4349" containerID="1fcbba84458820a6cd56e7847b84698f6911603ed0c37b59c69400f84044c11e" exitCode=0
Feb 01 08:47:21 crc kubenswrapper[5127]: I0201 08:47:21.315751 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8lmt" event={"ID":"ccb1b5ac-6358-4824-8649-5c48340d4349","Type":"ContainerDied","Data":"1fcbba84458820a6cd56e7847b84698f6911603ed0c37b59c69400f84044c11e"}
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.705370 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8lmt"
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.890680 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.890809 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.891081 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.892166 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp2f8\" (UniqueName: \"kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.892218 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.892263 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.892318 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle\") pod \"ccb1b5ac-6358-4824-8649-5c48340d4349\" (UID: \"ccb1b5ac-6358-4824-8649-5c48340d4349\") "
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.892865 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ccb1b5ac-6358-4824-8649-5c48340d4349-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.897668 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts" (OuterVolumeSpecName: "scripts") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.899272 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.899872 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8" (OuterVolumeSpecName: "kube-api-access-gp2f8") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "kube-api-access-gp2f8".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.939726 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.956702 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data" (OuterVolumeSpecName: "config-data") pod "ccb1b5ac-6358-4824-8649-5c48340d4349" (UID: "ccb1b5ac-6358-4824-8649-5c48340d4349"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.995895 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp2f8\" (UniqueName: \"kubernetes.io/projected/ccb1b5ac-6358-4824-8649-5c48340d4349-kube-api-access-gp2f8\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.995962 5127 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.995974 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.995985 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:22 crc kubenswrapper[5127]: I0201 08:47:22.996007 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb1b5ac-6358-4824-8649-5c48340d4349-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.335385 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8lmt" event={"ID":"ccb1b5ac-6358-4824-8649-5c48340d4349","Type":"ContainerDied","Data":"7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473"} Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.335435 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f752f81a403790756812607cbd920a1f71aaa5f6afc6f51bb3c53e22d896473" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.335496 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b8lmt" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.770988 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:47:23 crc kubenswrapper[5127]: E0201 08:47:23.771382 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349" containerName="cinder-db-sync" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.771395 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349" containerName="cinder-db-sync" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.771550 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349" containerName="cinder-db-sync" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.772518 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.810782 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.810870 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.810904 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.810947 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.810977 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7jbq\" (UniqueName: \"kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.811927 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.911767 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " 
pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.911821 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7jbq\" (UniqueName: \"kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.911892 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.911941 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.911966 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.912739 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.913424 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.917741 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.919334 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.938619 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7jbq\" (UniqueName: \"kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq\") pod \"dnsmasq-dns-6f67595cb7-ppcf5\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") " pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.956680 
5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.967366 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.974622 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ns9jb" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.979307 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.979486 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.979616 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 01 08:47:23 crc kubenswrapper[5127]: I0201 08:47:23.989551 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.014724 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjfrr\" (UniqueName: \"kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.014787 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.014830 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.014858 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.014881 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.015112 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.015237 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.095050 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120161 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120240 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120290 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjfrr\" (UniqueName: \"kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120314 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120347 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120371 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120384 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120780 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.120831 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc 
kubenswrapper[5127]: I0201 08:47:24.136748 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.137276 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.137320 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.137687 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.161338 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjfrr\" (UniqueName: \"kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr\") pod \"cinder-api-0\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.336502 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.682096 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:47:24 crc kubenswrapper[5127]: I0201 08:47:24.855421 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:47:25 crc kubenswrapper[5127]: I0201 08:47:25.355718 5127 generic.go:334] "Generic (PLEG): container finished" podID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerID="c819be784ccfaf353fcd5d1ffea9892db74d63e90b791956c589d37f8c4b2faa" exitCode=0 Feb 01 08:47:25 crc kubenswrapper[5127]: I0201 08:47:25.355857 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" event={"ID":"c539054e-2748-4c21-ab77-e8720cbc02bf","Type":"ContainerDied","Data":"c819be784ccfaf353fcd5d1ffea9892db74d63e90b791956c589d37f8c4b2faa"} Feb 01 08:47:25 crc kubenswrapper[5127]: I0201 08:47:25.356176 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" event={"ID":"c539054e-2748-4c21-ab77-e8720cbc02bf","Type":"ContainerStarted","Data":"fed838cd03164e50dac93e2c13e5a78f1c395d9145ea07e08161acf0361d0188"} Feb 01 08:47:25 crc kubenswrapper[5127]: I0201 08:47:25.365638 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerStarted","Data":"3c376aed6e5fb9d11935db819a78d9c5cf91a6c51bf7b040d78439138abfe9bf"} Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.424443 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerStarted","Data":"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21"} Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.424899 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerStarted","Data":"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee"} Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.426706 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.431623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" event={"ID":"c539054e-2748-4c21-ab77-e8720cbc02bf","Type":"ContainerStarted","Data":"cc584ee8cdc0b72db97d721bfb579c2037f5ef64aa41be2f9fa027c723163f7b"} Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.432190 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.453951 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.453935876 podStartE2EDuration="3.453935876s" podCreationTimestamp="2026-02-01 08:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:26.451509461 +0000 UTC m=+7196.937411824" watchObservedRunningTime="2026-02-01 08:47:26.453935876 +0000 UTC m=+7196.939838239" Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.477028 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" 
podStartSLOduration=3.477009226 podStartE2EDuration="3.477009226s" podCreationTimestamp="2026-02-01 08:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:26.472652789 +0000 UTC m=+7196.958555152" watchObservedRunningTime="2026-02-01 08:47:26.477009226 +0000 UTC m=+7196.962911589" Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.745403 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.746002 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7181dbcd-6372-4645-8426-47aaaf3eb576" containerName="nova-scheduler-scheduler" containerID="cri-o://7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.755430 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.755684 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.771774 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.772034 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="de093257-3d61-4df8-86af-1000c1964ff3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://01e7a40c09a8d30251a233d1c1a65779b4fdcc33891f92d5910592c994dcc59d" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.789203 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.789511 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-log" containerID="cri-o://2c5cac0f071d6df627852ba9752921bb000b427c77fcfb3a6dbd59e20a17ced4" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.789605 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-api" containerID="cri-o://9a038261d3adb8fa12352ac4b502ae568f5da97378f1c7d25ba7ecd7d1d8076c" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.802237 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.802456 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-log" containerID="cri-o://1bc54232db9b38feab09634736f918444e1fda9bb98ff872bc6db9be6f7ec4c8" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.802760 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-metadata" 
containerID="cri-o://acf340decbd8b5742ec6c8762fe554f7abeb6f14bbbca5fd0a5dd93f6cd00ff7" gracePeriod=30 Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.913171 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:26 crc kubenswrapper[5127]: I0201 08:47:26.913448 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" gracePeriod=30 Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.444663 5127 generic.go:334] "Generic (PLEG): container finished" podID="de093257-3d61-4df8-86af-1000c1964ff3" containerID="01e7a40c09a8d30251a233d1c1a65779b4fdcc33891f92d5910592c994dcc59d" exitCode=0 Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.444764 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de093257-3d61-4df8-86af-1000c1964ff3","Type":"ContainerDied","Data":"01e7a40c09a8d30251a233d1c1a65779b4fdcc33891f92d5910592c994dcc59d"} Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.449326 5127 generic.go:334] "Generic (PLEG): container finished" podID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerID="1bc54232db9b38feab09634736f918444e1fda9bb98ff872bc6db9be6f7ec4c8" exitCode=143 Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.449385 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerDied","Data":"1bc54232db9b38feab09634736f918444e1fda9bb98ff872bc6db9be6f7ec4c8"} Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.452253 5127 generic.go:334] "Generic (PLEG): container finished" podID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerID="2c5cac0f071d6df627852ba9752921bb000b427c77fcfb3a6dbd59e20a17ced4" exitCode=143 Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.452316 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerDied","Data":"2c5cac0f071d6df627852ba9752921bb000b427c77fcfb3a6dbd59e20a17ced4"} Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.814559 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.918459 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle\") pod \"de093257-3d61-4df8-86af-1000c1964ff3\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.918657 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s6n7\" (UniqueName: \"kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7\") pod \"de093257-3d61-4df8-86af-1000c1964ff3\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.918974 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data\") pod \"de093257-3d61-4df8-86af-1000c1964ff3\" (UID: \"de093257-3d61-4df8-86af-1000c1964ff3\") " Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.927795 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7" (OuterVolumeSpecName: "kube-api-access-8s6n7") pod "de093257-3d61-4df8-86af-1000c1964ff3" (UID: "de093257-3d61-4df8-86af-1000c1964ff3"). InnerVolumeSpecName "kube-api-access-8s6n7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.966982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de093257-3d61-4df8-86af-1000c1964ff3" (UID: "de093257-3d61-4df8-86af-1000c1964ff3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:27 crc kubenswrapper[5127]: I0201 08:47:27.987289 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data" (OuterVolumeSpecName: "config-data") pod "de093257-3d61-4df8-86af-1000c1964ff3" (UID: "de093257-3d61-4df8-86af-1000c1964ff3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.020523 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s6n7\" (UniqueName: \"kubernetes.io/projected/de093257-3d61-4df8-86af-1000c1964ff3-kube-api-access-8s6n7\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.020551 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.020563 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de093257-3d61-4df8-86af-1000c1964ff3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.060051 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.061373 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.062449 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.063432 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.063462 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerName="nova-cell0-conductor-conductor" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.225078 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle\") pod \"7181dbcd-6372-4645-8426-47aaaf3eb576\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.225187 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data\") pod \"7181dbcd-6372-4645-8426-47aaaf3eb576\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.225224 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwtdm\" (UniqueName: \"kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm\") pod \"7181dbcd-6372-4645-8426-47aaaf3eb576\" (UID: \"7181dbcd-6372-4645-8426-47aaaf3eb576\") " Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.229410 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm" (OuterVolumeSpecName: "kube-api-access-qwtdm") pod "7181dbcd-6372-4645-8426-47aaaf3eb576" (UID: "7181dbcd-6372-4645-8426-47aaaf3eb576"). InnerVolumeSpecName "kube-api-access-qwtdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.247078 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7181dbcd-6372-4645-8426-47aaaf3eb576" (UID: "7181dbcd-6372-4645-8426-47aaaf3eb576"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.248396 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data" (OuterVolumeSpecName: "config-data") pod "7181dbcd-6372-4645-8426-47aaaf3eb576" (UID: "7181dbcd-6372-4645-8426-47aaaf3eb576"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.327637 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.327678 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7181dbcd-6372-4645-8426-47aaaf3eb576-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.327689 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwtdm\" (UniqueName: \"kubernetes.io/projected/7181dbcd-6372-4645-8426-47aaaf3eb576-kube-api-access-qwtdm\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.464183 5127 generic.go:334] "Generic (PLEG): container finished" podID="7181dbcd-6372-4645-8426-47aaaf3eb576" containerID="7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5" exitCode=0 Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.464249 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7181dbcd-6372-4645-8426-47aaaf3eb576","Type":"ContainerDied","Data":"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5"} Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.464288 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.465063 5127 scope.go:117] "RemoveContainer" containerID="7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.465038 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7181dbcd-6372-4645-8426-47aaaf3eb576","Type":"ContainerDied","Data":"7e87e3268dddae25b828c71e9b209aaff42b4a7a4a3165d54bd226e4b54c7ecc"} Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.466589 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"de093257-3d61-4df8-86af-1000c1964ff3","Type":"ContainerDied","Data":"2b76dd1c63e8192dd56248645321696e6f8eab7ff20edb8cb47c5fe1734e4be1"} Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.466614 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.494245 5127 scope.go:117] "RemoveContainer" containerID="7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5" Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.494829 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5\": container with ID starting with 7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5 not found: ID does not exist" containerID="7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.494898 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5"} err="failed to get container status \"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5\": rpc error: code = NotFound desc = could not find container \"7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5\": container with ID starting with 7574593747affbd7b96e1d536f8a4840bc3759bd8834f7558368a74183f479c5 not found: ID does not exist" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.494959 5127 scope.go:117] "RemoveContainer" containerID="01e7a40c09a8d30251a233d1c1a65779b4fdcc33891f92d5910592c994dcc59d" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.497826 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.508671 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.517948 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.529660 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.539257 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.539742 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de093257-3d61-4df8-86af-1000c1964ff3" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.539763 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de093257-3d61-4df8-86af-1000c1964ff3" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 08:47:28 crc kubenswrapper[5127]: E0201 08:47:28.539781 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7181dbcd-6372-4645-8426-47aaaf3eb576" containerName="nova-scheduler-scheduler" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.539788 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7181dbcd-6372-4645-8426-47aaaf3eb576" containerName="nova-scheduler-scheduler" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.539969 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7181dbcd-6372-4645-8426-47aaaf3eb576" containerName="nova-scheduler-scheduler" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.539985 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de093257-3d61-4df8-86af-1000c1964ff3" containerName="nova-cell1-novncproxy-novncproxy" Feb 01 08:47:28 
crc kubenswrapper[5127]: I0201 08:47:28.540697 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.546308 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.549064 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.550540 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.553185 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.558201 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.613324 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.734190 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.734249 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh85k\" (UniqueName: \"kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.734298 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdgbz\" (UniqueName: \"kubernetes.io/projected/38794c77-9e2e-450a-832c-5e913b09350a-kube-api-access-gdgbz\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.734817 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.735115 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.735184 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: 
I0201 08:47:28.837001 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdgbz\" (UniqueName: \"kubernetes.io/projected/38794c77-9e2e-450a-832c-5e913b09350a-kube-api-access-gdgbz\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.837110 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.837162 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.837188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.837221 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.837242 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh85k\" (UniqueName: \"kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.844290 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.845370 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.854258 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.855106 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38794c77-9e2e-450a-832c-5e913b09350a-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.858110 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh85k\" (UniqueName: \"kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k\") pod \"nova-scheduler-0\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " pod="openstack/nova-scheduler-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.869992 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdgbz\" (UniqueName: \"kubernetes.io/projected/38794c77-9e2e-450a-832c-5e913b09350a-kube-api-access-gdgbz\") pod \"nova-cell1-novncproxy-0\" (UID: \"38794c77-9e2e-450a-832c-5e913b09350a\") " pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.887465 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:28 crc kubenswrapper[5127]: I0201 08:47:28.894996 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 08:47:29 crc kubenswrapper[5127]: I0201 08:47:29.350870 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 01 08:47:29 crc kubenswrapper[5127]: I0201 08:47:29.471682 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 08:47:29 crc kubenswrapper[5127]: I0201 08:47:29.480787 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38794c77-9e2e-450a-832c-5e913b09350a","Type":"ContainerStarted","Data":"6dc7ece2e3b27f7747492a3ba5fdfbe1b17b8b64f02eaa3dc325e9f34b255f35"} Feb 01 08:47:29 crc kubenswrapper[5127]: W0201 08:47:29.490070 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec2f810d_1f20_4378_ba82_cb5630da7544.slice/crio-2170a4ab77296e373c7657d42102a4a36642790258eec50d5dda4b89a8c53ad7 WatchSource:0}: Error finding container 2170a4ab77296e373c7657d42102a4a36642790258eec50d5dda4b89a8c53ad7: Status 404 returned error can't find the container with id 2170a4ab77296e373c7657d42102a4a36642790258eec50d5dda4b89a8c53ad7 Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.265875 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7181dbcd-6372-4645-8426-47aaaf3eb576" path="/var/lib/kubelet/pods/7181dbcd-6372-4645-8426-47aaaf3eb576/volumes" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.267148 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de093257-3d61-4df8-86af-1000c1964ff3" path="/var/lib/kubelet/pods/de093257-3d61-4df8-86af-1000c1964ff3/volumes" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.514353 5127 generic.go:334] "Generic (PLEG): container finished" podID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerID="9a038261d3adb8fa12352ac4b502ae568f5da97378f1c7d25ba7ecd7d1d8076c" exitCode=0 Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.514736 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerDied","Data":"9a038261d3adb8fa12352ac4b502ae568f5da97378f1c7d25ba7ecd7d1d8076c"} Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.516067 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"38794c77-9e2e-450a-832c-5e913b09350a","Type":"ContainerStarted","Data":"ad43932e9278c9bb65e66062aee8eb4e33e85a46a813f7ba1fd607a483f5477a"} Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.521897 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec2f810d-1f20-4378-ba82-cb5630da7544","Type":"ContainerStarted","Data":"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c"} Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.521936 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec2f810d-1f20-4378-ba82-cb5630da7544","Type":"ContainerStarted","Data":"2170a4ab77296e373c7657d42102a4a36642790258eec50d5dda4b89a8c53ad7"} Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.525860 5127 generic.go:334] "Generic (PLEG): container finished" podID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerID="acf340decbd8b5742ec6c8762fe554f7abeb6f14bbbca5fd0a5dd93f6cd00ff7" exitCode=0 Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.525895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerDied","Data":"acf340decbd8b5742ec6c8762fe554f7abeb6f14bbbca5fd0a5dd93f6cd00ff7"} Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.543318 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.543300912 podStartE2EDuration="2.543300912s" podCreationTimestamp="2026-02-01 08:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:30.539621363 +0000 UTC m=+7201.025523726" watchObservedRunningTime="2026-02-01 08:47:30.543300912 +0000 UTC m=+7201.029203275" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.588399 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.588345302 podStartE2EDuration="2.588345302s" podCreationTimestamp="2026-02-01 08:47:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:30.584835268 +0000 UTC m=+7201.070737621" watchObservedRunningTime="2026-02-01 08:47:30.588345302 +0000 UTC m=+7201.074247665" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.635918 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.743042 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.806453 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs\") pod \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.806540 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle\") pod \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.806649 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml7ts\" (UniqueName: \"kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts\") pod \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.806703 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data\") pod \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\" (UID: \"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.806960 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs" (OuterVolumeSpecName: "logs") pod "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" (UID: "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.807401 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.816685 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts" (OuterVolumeSpecName: "kube-api-access-ml7ts") pod "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" (UID: "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb"). InnerVolumeSpecName "kube-api-access-ml7ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.846783 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" (UID: "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.847446 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data" (OuterVolumeSpecName: "config-data") pod "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" (UID: "8ef2d54a-e179-42ed-8a5e-60a0ec1606bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.909401 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data\") pod \"64c5b944-c0b8-4afe-a270-25911efaf8fb\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.909471 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjr9c\" (UniqueName: \"kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c\") pod \"64c5b944-c0b8-4afe-a270-25911efaf8fb\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.909731 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs\") pod \"64c5b944-c0b8-4afe-a270-25911efaf8fb\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.912111 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs" (OuterVolumeSpecName: "logs") pod "64c5b944-c0b8-4afe-a270-25911efaf8fb" (UID: "64c5b944-c0b8-4afe-a270-25911efaf8fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.912355 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle\") pod \"64c5b944-c0b8-4afe-a270-25911efaf8fb\" (UID: \"64c5b944-c0b8-4afe-a270-25911efaf8fb\") " Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.913347 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.913374 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml7ts\" (UniqueName: \"kubernetes.io/projected/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-kube-api-access-ml7ts\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.913387 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.913398 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64c5b944-c0b8-4afe-a270-25911efaf8fb-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.915480 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c" (OuterVolumeSpecName: "kube-api-access-xjr9c") pod "64c5b944-c0b8-4afe-a270-25911efaf8fb" (UID: "64c5b944-c0b8-4afe-a270-25911efaf8fb"). InnerVolumeSpecName "kube-api-access-xjr9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.952174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64c5b944-c0b8-4afe-a270-25911efaf8fb" (UID: "64c5b944-c0b8-4afe-a270-25911efaf8fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:30 crc kubenswrapper[5127]: I0201 08:47:30.959753 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data" (OuterVolumeSpecName: "config-data") pod "64c5b944-c0b8-4afe-a270-25911efaf8fb" (UID: "64c5b944-c0b8-4afe-a270-25911efaf8fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.014378 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.014412 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64c5b944-c0b8-4afe-a270-25911efaf8fb-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.014423 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjr9c\" (UniqueName: \"kubernetes.io/projected/64c5b944-c0b8-4afe-a270-25911efaf8fb-kube-api-access-xjr9c\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.553521 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6 is running failed: container process not found" containerID="6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.554595 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6 is running failed: container process not found" containerID="6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.555217 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6 is running failed: container process not found" containerID="6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.555459 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" 
containerName="nova-cell1-conductor-conductor" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.607505 5127 generic.go:334] "Generic (PLEG): container finished" podID="89951ae8-f890-4ce2-9146-fed7435253c5" containerID="6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" exitCode=0 Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.607623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89951ae8-f890-4ce2-9146-fed7435253c5","Type":"ContainerDied","Data":"6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6"} Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.619417 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"64c5b944-c0b8-4afe-a270-25911efaf8fb","Type":"ContainerDied","Data":"b11e74ac2fd8b4f1940c63ca7099538a3684d22ff48e7f570205b42003f22ab8"} Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.619462 5127 scope.go:117] "RemoveContainer" containerID="acf340decbd8b5742ec6c8762fe554f7abeb6f14bbbca5fd0a5dd93f6cd00ff7" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.619619 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.628913 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ef2d54a-e179-42ed-8a5e-60a0ec1606bb","Type":"ContainerDied","Data":"90f4b6747ef929142f1c8e877c5a37034141e6c5acdfe2f01384073c978ac4ee"} Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.629041 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.711332 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.718033 5127 scope.go:117] "RemoveContainer" containerID="1bc54232db9b38feab09634736f918444e1fda9bb98ff872bc6db9be6f7ec4c8" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.737324 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7mn\" (UniqueName: \"kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn\") pod \"89951ae8-f890-4ce2-9146-fed7435253c5\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.737433 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle\") pod \"89951ae8-f890-4ce2-9146-fed7435253c5\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.737472 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data\") pod \"89951ae8-f890-4ce2-9146-fed7435253c5\" (UID: \"89951ae8-f890-4ce2-9146-fed7435253c5\") " Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.744468 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn" (OuterVolumeSpecName: "kube-api-access-8v7mn") pod "89951ae8-f890-4ce2-9146-fed7435253c5" (UID: "89951ae8-f890-4ce2-9146-fed7435253c5"). 
InnerVolumeSpecName "kube-api-access-8v7mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.746742 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.759528 5127 scope.go:117] "RemoveContainer" containerID="9a038261d3adb8fa12352ac4b502ae568f5da97378f1c7d25ba7ecd7d1d8076c" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.777620 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.786595 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.797086 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.798814 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89951ae8-f890-4ce2-9146-fed7435253c5" (UID: "89951ae8-f890-4ce2-9146-fed7435253c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.808831 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.809261 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" containerName="nova-cell1-conductor-conductor" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809274 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" containerName="nova-cell1-conductor-conductor" Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.809295 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-metadata" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809303 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-metadata" Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.809317 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-log" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809324 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-log" Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.809348 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-log" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809358 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-log" Feb 01 08:47:31 crc kubenswrapper[5127]: E0201 08:47:31.809373 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-api" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809382 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-api" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809620 
5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-log" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809641 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-log" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809660 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" containerName="nova-metadata-metadata" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809678 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" containerName="nova-api-api" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.809693 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" containerName="nova-cell1-conductor-conductor" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.810788 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.813347 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.831277 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data" (OuterVolumeSpecName: "config-data") pod "89951ae8-f890-4ce2-9146-fed7435253c5" (UID: "89951ae8-f890-4ce2-9146-fed7435253c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.831560 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.833756 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.838009 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839783 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839861 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptjl9\" (UniqueName: \"kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839901 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839922 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839942 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.839959 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.840214 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8mz\" (UniqueName: \"kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.840253 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.840375 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7mn\" (UniqueName: \"kubernetes.io/projected/89951ae8-f890-4ce2-9146-fed7435253c5-kube-api-access-8v7mn\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 
08:47:31.840387 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.840398 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89951ae8-f890-4ce2-9146-fed7435253c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.844975 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.847306 5127 scope.go:117] "RemoveContainer" containerID="2c5cac0f071d6df627852ba9752921bb000b427c77fcfb3a6dbd59e20a17ced4" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.857727 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942681 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptjl9\" (UniqueName: \"kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942728 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942798 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942822 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942881 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8mz\" (UniqueName: \"kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.942899 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.944185 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.944398 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.948185 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.948255 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.948558 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.949632 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.963907 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8mz\" (UniqueName: \"kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz\") pod \"nova-api-0\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " pod="openstack/nova-api-0" Feb 01 08:47:31 crc kubenswrapper[5127]: I0201 08:47:31.964515 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptjl9\" (UniqueName: \"kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9\") pod \"nova-metadata-0\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " pod="openstack/nova-metadata-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.160369 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.194070 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.263603 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c5b944-c0b8-4afe-a270-25911efaf8fb" path="/var/lib/kubelet/pods/64c5b944-c0b8-4afe-a270-25911efaf8fb/volumes" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.264602 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef2d54a-e179-42ed-8a5e-60a0ec1606bb" path="/var/lib/kubelet/pods/8ef2d54a-e179-42ed-8a5e-60a0ec1606bb/volumes" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.313964 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.356621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle\") pod \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.356932 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4sx\" (UniqueName: \"kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx\") pod \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.357058 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data\") pod \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\" (UID: \"b90f8b5c-58e6-48e5-af62-2e0736a6895f\") " Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.364565 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx" (OuterVolumeSpecName: "kube-api-access-6g4sx") pod "b90f8b5c-58e6-48e5-af62-2e0736a6895f" (UID: "b90f8b5c-58e6-48e5-af62-2e0736a6895f"). InnerVolumeSpecName "kube-api-access-6g4sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.395785 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b90f8b5c-58e6-48e5-af62-2e0736a6895f" (UID: "b90f8b5c-58e6-48e5-af62-2e0736a6895f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.405785 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data" (OuterVolumeSpecName: "config-data") pod "b90f8b5c-58e6-48e5-af62-2e0736a6895f" (UID: "b90f8b5c-58e6-48e5-af62-2e0736a6895f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.460134 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.460179 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g4sx\" (UniqueName: \"kubernetes.io/projected/b90f8b5c-58e6-48e5-af62-2e0736a6895f-kube-api-access-6g4sx\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.460195 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90f8b5c-58e6-48e5-af62-2e0736a6895f-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.637722 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.637747 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"89951ae8-f890-4ce2-9146-fed7435253c5","Type":"ContainerDied","Data":"ce8be4eb6e4cf5d5aaa559ad891d00fd28c6b9b2087d58de77dc2b36ced0bf17"} Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.637811 5127 scope.go:117] "RemoveContainer" containerID="6dd1f20cb6f0f676b38ac0b50b1cba6255931621aea9e5963fba64fe814700f6" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.650922 5127 generic.go:334] "Generic (PLEG): container finished" podID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" exitCode=0 Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.651032 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b90f8b5c-58e6-48e5-af62-2e0736a6895f","Type":"ContainerDied","Data":"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049"} Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.651081 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b90f8b5c-58e6-48e5-af62-2e0736a6895f","Type":"ContainerDied","Data":"67e29dd4b9a5479a7cad76234db849aeda07116ce2a476ae160da81df449f391"} Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.651167 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.669648 5127 scope.go:117] "RemoveContainer" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.672048 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.691063 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.707234 5127 scope.go:117] "RemoveContainer" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.707407 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: E0201 08:47:32.708130 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerName="nova-cell0-conductor-conductor" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.708161 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerName="nova-cell0-conductor-conductor" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.708429 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" containerName="nova-cell0-conductor-conductor" Feb 01 08:47:32 crc kubenswrapper[5127]: E0201 08:47:32.720864 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049\": container with ID starting with 39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049 not found: ID does not exist" containerID="39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.720920 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049"} err="failed to get container status \"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049\": rpc error: code = NotFound desc = could not find container \"39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049\": container with ID starting with 39256444b252d55dbe0f13bbc65d179a05e4e0788b8eac793619dcb6bd442049 not found: ID does not exist" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.722313 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.726558 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.734775 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.758428 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.768211 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.768275 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.768343 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4q7v\" (UniqueName: \"kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.768466 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.781801 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.797799 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.799644 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.803150 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.828342 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.869960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.870271 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmssb\" (UniqueName: \"kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.870453 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.870994 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.871830 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.872161 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4q7v\" (UniqueName: \"kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: W0201 08:47:32.872537 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47d13d57_17e3_4d77_8cfe_30c383444cf7.slice/crio-b800c4007e59115e6594a105c1f77b232d5e51975918eb599ec776f4b09879bb WatchSource:0}: Error finding container b800c4007e59115e6594a105c1f77b232d5e51975918eb599ec776f4b09879bb: Status 404 returned error can't find the container with id b800c4007e59115e6594a105c1f77b232d5e51975918eb599ec776f4b09879bb Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.878708 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data\") pod \"nova-cell1-conductor-0\" 
(UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.879609 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.885113 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.900139 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4q7v\" (UniqueName: \"kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v\") pod \"nova-cell1-conductor-0\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.974073 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.974123 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmssb\" (UniqueName: \"kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.974186 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.978682 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.982205 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:32 crc kubenswrapper[5127]: I0201 08:47:32.996149 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmssb\" (UniqueName: \"kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb\") pod \"nova-cell0-conductor-0\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.074875 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.153163 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.665326 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerStarted","Data":"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.665884 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerStarted","Data":"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.665906 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerStarted","Data":"a119009c5f8a8b3043bd7c09bed62cfda0b2f60bea00e62c344e10a9b968355a"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.668428 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerStarted","Data":"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.668472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerStarted","Data":"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.668483 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerStarted","Data":"b800c4007e59115e6594a105c1f77b232d5e51975918eb599ec776f4b09879bb"} Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.694151 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.694127913 podStartE2EDuration="2.694127913s" podCreationTimestamp="2026-02-01 08:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:33.686049426 +0000 UTC m=+7204.171951789" watchObservedRunningTime="2026-02-01 08:47:33.694127913 +0000 UTC m=+7204.180030276" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.719876 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.719853234 podStartE2EDuration="2.719853234s" podCreationTimestamp="2026-02-01 08:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:33.70519133 +0000 UTC m=+7204.191093693" watchObservedRunningTime="2026-02-01 08:47:33.719853234 +0000 UTC m=+7204.205755597" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.759884 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.892794 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.895364 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 08:47:33 crc kubenswrapper[5127]: W0201 08:47:33.910162 5127 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ccca7b_584b_4aa9_badd_0438284cfa51.slice/crio-cd61e0d69165ac8490d03a543348f99537a89d8ceda4077e136b2b87a21153e0 WatchSource:0}: Error finding container cd61e0d69165ac8490d03a543348f99537a89d8ceda4077e136b2b87a21153e0: Status 404 returned error can't find the container with id cd61e0d69165ac8490d03a543348f99537a89d8ceda4077e136b2b87a21153e0 Feb 01 08:47:33 crc kubenswrapper[5127]: I0201 08:47:33.914060 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.096948 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.182712 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"] Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.183268 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="dnsmasq-dns" containerID="cri-o://2865fa79137da1928bf6ad7777870c0f2dbd18d0eb1728e079e44304677ebe08" gracePeriod=10 Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.249428 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89951ae8-f890-4ce2-9146-fed7435253c5" path="/var/lib/kubelet/pods/89951ae8-f890-4ce2-9146-fed7435253c5/volumes" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.251538 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90f8b5c-58e6-48e5-af62-2e0736a6895f" path="/var/lib/kubelet/pods/b90f8b5c-58e6-48e5-af62-2e0736a6895f/volumes" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.691173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4ccca7b-584b-4aa9-badd-0438284cfa51","Type":"ContainerStarted","Data":"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013"} Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.691663 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4ccca7b-584b-4aa9-badd-0438284cfa51","Type":"ContainerStarted","Data":"cd61e0d69165ac8490d03a543348f99537a89d8ceda4077e136b2b87a21153e0"} Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.692939 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.705791 5127 generic.go:334] "Generic (PLEG): container finished" podID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerID="2865fa79137da1928bf6ad7777870c0f2dbd18d0eb1728e079e44304677ebe08" exitCode=0 Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.705866 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" event={"ID":"e11ed41e-754f-46f4-af29-99cef23e3ef6","Type":"ContainerDied","Data":"2865fa79137da1928bf6ad7777870c0f2dbd18d0eb1728e079e44304677ebe08"} Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.722652 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f1d4e0b-4c49-4323-b3a7-48363d831f2b","Type":"ContainerStarted","Data":"d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c"} Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.722697 5127 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f1d4e0b-4c49-4323-b3a7-48363d831f2b","Type":"ContainerStarted","Data":"b3ccb39f4383ab9f59bf89dcfa6fce6b12cf351501be5cbef267e9d9af71b089"} Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.723262 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.751397 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.751365816 podStartE2EDuration="2.751365816s" podCreationTimestamp="2026-02-01 08:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:34.729266932 +0000 UTC m=+7205.215169295" watchObservedRunningTime="2026-02-01 08:47:34.751365816 +0000 UTC m=+7205.237268179" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.760440 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.770179 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.770151501 podStartE2EDuration="2.770151501s" podCreationTimestamp="2026-02-01 08:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:47:34.758061516 +0000 UTC m=+7205.243963879" watchObservedRunningTime="2026-02-01 08:47:34.770151501 +0000 UTC m=+7205.256053864" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.845750 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config\") pod \"e11ed41e-754f-46f4-af29-99cef23e3ef6\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.845804 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb\") pod \"e11ed41e-754f-46f4-af29-99cef23e3ef6\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.845843 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnthr\" (UniqueName: \"kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr\") pod \"e11ed41e-754f-46f4-af29-99cef23e3ef6\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.845862 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc\") pod \"e11ed41e-754f-46f4-af29-99cef23e3ef6\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.845987 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb\") pod \"e11ed41e-754f-46f4-af29-99cef23e3ef6\" (UID: \"e11ed41e-754f-46f4-af29-99cef23e3ef6\") " Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.863870 5127 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr" (OuterVolumeSpecName: "kube-api-access-bnthr") pod "e11ed41e-754f-46f4-af29-99cef23e3ef6" (UID: "e11ed41e-754f-46f4-af29-99cef23e3ef6"). InnerVolumeSpecName "kube-api-access-bnthr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.907546 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e11ed41e-754f-46f4-af29-99cef23e3ef6" (UID: "e11ed41e-754f-46f4-af29-99cef23e3ef6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.912796 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e11ed41e-754f-46f4-af29-99cef23e3ef6" (UID: "e11ed41e-754f-46f4-af29-99cef23e3ef6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.919934 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e11ed41e-754f-46f4-af29-99cef23e3ef6" (UID: "e11ed41e-754f-46f4-af29-99cef23e3ef6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.940246 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config" (OuterVolumeSpecName: "config") pod "e11ed41e-754f-46f4-af29-99cef23e3ef6" (UID: "e11ed41e-754f-46f4-af29-99cef23e3ef6"). InnerVolumeSpecName "config". 
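The UnmountVolume.TearDown and "Volume detached" sequence here is the kubelet's volume manager reconciling desired state against actual state: once the dnsmasq pod is deleted, its five volumes are no longer desired, so each still-mounted one is torn down. Conceptually a set difference, sketched below with the volume names from this log (the real reconciler tracks far richer per-volume state):

```go
package main

import "fmt"

// Volumes that are still mounted (actual) but no longer wanted (desired)
// must be unmounted; each success is then reported as "Volume detached".
func volumesToUnmount(desired, actual map[string]bool) []string {
	var stale []string
	for vol := range actual {
		if !desired[vol] {
			stale = append(stale, vol)
		}
	}
	return stale
}

func main() {
	mounted := map[string]bool{
		"config": true, "dns-svc": true, "kube-api-access-bnthr": true,
		"ovsdbserver-nb": true, "ovsdbserver-sb": true,
	}
	// The deleted pod contributes nothing to the desired world anymore.
	fmt.Println(volumesToUnmount(map[string]bool{}, mounted))
}
```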
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.953553 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.953621 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.953639 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.953653 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnthr\" (UniqueName: \"kubernetes.io/projected/e11ed41e-754f-46f4-af29-99cef23e3ef6-kube-api-access-bnthr\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:34 crc kubenswrapper[5127]: I0201 08:47:34.953669 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e11ed41e-754f-46f4-af29-99cef23e3ef6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.732721 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" event={"ID":"e11ed41e-754f-46f4-af29-99cef23e3ef6","Type":"ContainerDied","Data":"99952e5c046f5289b57f93355c7b7290adc99f23163ddd3f9476dfc5e3277a08"} Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.734684 5127 scope.go:117] "RemoveContainer" containerID="2865fa79137da1928bf6ad7777870c0f2dbd18d0eb1728e079e44304677ebe08" Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.732893 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cc7954dc-ql6pq" Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.773119 5127 scope.go:117] "RemoveContainer" containerID="28497879c600eafa78e23dffaf3bb27d9c09a425ec939134d138657831d423cd" Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.779895 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"] Feb 01 08:47:35 crc kubenswrapper[5127]: I0201 08:47:35.802213 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cc7954dc-ql6pq"] Feb 01 08:47:36 crc kubenswrapper[5127]: I0201 08:47:36.124333 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 01 08:47:36 crc kubenswrapper[5127]: I0201 08:47:36.249677 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" path="/var/lib/kubelet/pods/e11ed41e-754f-46f4-af29-99cef23e3ef6/volumes" Feb 01 08:47:36 crc kubenswrapper[5127]: I0201 08:47:36.740868 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:47:36 crc kubenswrapper[5127]: I0201 08:47:36.741758 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:47:37 crc kubenswrapper[5127]: I0201 08:47:37.162042 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 08:47:37 crc kubenswrapper[5127]: I0201 08:47:37.162109 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 08:47:38 crc kubenswrapper[5127]: I0201 08:47:38.888103 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:38 crc kubenswrapper[5127]: I0201 08:47:38.896906 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 08:47:38 crc kubenswrapper[5127]: I0201 08:47:38.910988 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:38 crc kubenswrapper[5127]: I0201 08:47:38.951505 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 08:47:39 crc kubenswrapper[5127]: I0201 08:47:39.785048 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 01 08:47:39 crc kubenswrapper[5127]: I0201 08:47:39.844142 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 08:47:42 crc kubenswrapper[5127]: I0201 08:47:42.161749 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 08:47:42 crc kubenswrapper[5127]: I0201 08:47:42.162327 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 08:47:42 crc kubenswrapper[5127]: I0201 08:47:42.194326 5127 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 08:47:42 crc kubenswrapper[5127]: I0201 08:47:42.194397 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.107294 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.183860 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.202846 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.326871 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.94:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.326947 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:47:43 crc kubenswrapper[5127]: I0201 08:47:43.326998 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.94:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.165198 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.165985 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.168501 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.168721 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.203114 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.205885 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.208750 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.209057 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.945859 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 
08:47:52 crc kubenswrapper[5127]: I0201 08:47:52.956478 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.028922 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:47:58 crc kubenswrapper[5127]: E0201 08:47:58.030075 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="init" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.030093 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="init" Feb 01 08:47:58 crc kubenswrapper[5127]: E0201 08:47:58.030112 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="dnsmasq-dns" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.030120 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="dnsmasq-dns" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.030322 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11ed41e-754f-46f4-af29-99cef23e3ef6" containerName="dnsmasq-dns" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.031623 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.036004 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080444 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080513 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080624 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080696 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080721 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjm86\" (UniqueName: \"kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " 
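The cpu_manager/memory_manager RemoveStaleState entries above are housekeeping performed while admitting cinder-scheduler-0: the resource managers drop per-container assignments that still reference the long-deleted dnsmasq pod. Roughly, assuming a simple map of (podUID, container) to an assignment (the values below are made up; the real managers persist CPU sets and NUMA state):

```go
package main

import "fmt"

type key struct{ podUID, container string }

// Drop assignments whose pod no longer exists; deleting from a map while
// ranging over it is safe in Go.
func removeStaleState(assignments map[key]string, livePods map[string]bool) {
	for k := range assignments {
		if !livePods[k.podUID] {
			fmt.Printf("Deleted CPUSet assignment podUID=%s container=%s\n", k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	a := map[key]string{ // the two stale containers named in the log
		{"e11ed41e-754f-46f4-af29-99cef23e3ef6", "init"}:        "0-3",
		{"e11ed41e-754f-46f4-af29-99cef23e3ef6", "dnsmasq-dns"}: "0-3",
	}
	removeStaleState(a, map[string]bool{}) // the pod was REMOVEd at 08:47:35
}
```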
pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080750 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.080891 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182298 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182379 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182470 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182651 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjm86\" (UniqueName: \"kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182704 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.182875 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.191128 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 
08:47:58.192256 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.194416 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.195222 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.205729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjm86\" (UniqueName: \"kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86\") pod \"cinder-scheduler-0\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.365320 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:47:58 crc kubenswrapper[5127]: I0201 08:47:58.899220 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:47:59 crc kubenswrapper[5127]: I0201 08:47:59.010923 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerStarted","Data":"f97e85eb370639890664f4a3aef49c6b259dba4a5f1fc5a5a271dcaf14771e8c"} Feb 01 08:47:59 crc kubenswrapper[5127]: I0201 08:47:59.918242 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:47:59 crc kubenswrapper[5127]: I0201 08:47:59.920042 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api-log" containerID="cri-o://4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee" gracePeriod=30 Feb 01 08:47:59 crc kubenswrapper[5127]: I0201 08:47:59.920563 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api" containerID="cri-o://dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21" gracePeriod=30 Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.022547 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerStarted","Data":"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d"} Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.430422 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.432569 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.435958 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.444907 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629557 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629617 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629753 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629797 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqql6\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-kube-api-access-cqql6\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629823 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629844 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629868 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629917 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 
08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629939 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-run\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629963 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.629995 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.630126 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.630194 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.630264 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.630353 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.630458 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.731922 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.731979 
5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-run\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732073 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732098 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732135 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732139 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732154 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732193 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732226 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data\") pod \"cinder-volume-volume1-0\" 
(UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732261 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732342 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732382 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732398 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqql6\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-kube-api-access-cqql6\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732415 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732459 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732496 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732500 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-dev\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732527 5127 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732559 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-run\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732631 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732696 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.732829 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-sys\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.733031 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2a245e44-99f1-49ba-b15e-bb4ffd755769-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.736271 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.747124 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.748765 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.749223 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.749256 
5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a245e44-99f1-49ba-b15e-bb4ffd755769-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.749448 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqql6\" (UniqueName: \"kubernetes.io/projected/2a245e44-99f1-49ba-b15e-bb4ffd755769-kube-api-access-cqql6\") pod \"cinder-volume-volume1-0\" (UID: \"2a245e44-99f1-49ba-b15e-bb4ffd755769\") " pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:00 crc kubenswrapper[5127]: I0201 08:48:00.755557 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.052263 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerStarted","Data":"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773"} Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.064989 5127 generic.go:334] "Generic (PLEG): container finished" podID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerID="4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee" exitCode=143 Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.065078 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerDied","Data":"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee"} Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.081975 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.815155676 podStartE2EDuration="4.081957395s" podCreationTimestamp="2026-02-01 08:47:57 +0000 UTC" firstStartedPulling="2026-02-01 08:47:58.91175283 +0000 UTC m=+7229.397655213" lastFinishedPulling="2026-02-01 08:47:59.178554569 +0000 UTC m=+7229.664456932" observedRunningTime="2026-02-01 08:48:01.080853256 +0000 UTC m=+7231.566755619" watchObservedRunningTime="2026-02-01 08:48:01.081957395 +0000 UTC m=+7231.567859758" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.150899 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.160056 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.164527 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.173787 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.275893 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-ceph\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.275998 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276186 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-dev\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276244 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-sys\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276297 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276333 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-scripts\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276468 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276714 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276784 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq429\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-kube-api-access-qq429\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276855 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276912 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.276958 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-lib-modules\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.277031 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-run\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.277086 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.277127 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378602 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-run\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 
01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378703 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378728 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-ceph\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378764 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378765 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-nvme\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378904 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-dev\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378944 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-sys\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378977 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379088 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-scripts\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379115 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 
08:48:01.379229 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379269 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq429\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-kube-api-access-qq429\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379299 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379326 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379370 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-lib-modules\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379537 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-lib-modules\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378691 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-run\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.378765 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379601 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379650 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-dev\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc 
kubenswrapper[5127]: I0201 08:48:01.379680 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379734 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.379727 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3efe1f7-64a9-45d0-a949-536543461c61-sys\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.384631 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-ceph\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.385155 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data-custom\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.385759 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-config-data\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.386122 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.394195 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.397755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3efe1f7-64a9-45d0-a949-536543461c61-scripts\") pod \"cinder-backup-0\" (UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.406954 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq429\" (UniqueName: \"kubernetes.io/projected/a3efe1f7-64a9-45d0-a949-536543461c61-kube-api-access-qq429\") pod \"cinder-backup-0\" 
(UID: \"a3efe1f7-64a9-45d0-a949-536543461c61\") " pod="openstack/cinder-backup-0" Feb 01 08:48:01 crc kubenswrapper[5127]: I0201 08:48:01.497381 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 01 08:48:02 crc kubenswrapper[5127]: I0201 08:48:02.077558 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2a245e44-99f1-49ba-b15e-bb4ffd755769","Type":"ContainerStarted","Data":"04e8b5406011ea946bc153b414be4155fda3426cb89fab12f5fe6646a5d1508e"} Feb 01 08:48:02 crc kubenswrapper[5127]: I0201 08:48:02.079207 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2a245e44-99f1-49ba-b15e-bb4ffd755769","Type":"ContainerStarted","Data":"cf7e9bf04bfaa99fe7d820e95717d11c6a80a997a4e5b7424450c7a83c7360e3"} Feb 01 08:48:02 crc kubenswrapper[5127]: I0201 08:48:02.228100 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.095772 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"2a245e44-99f1-49ba-b15e-bb4ffd755769","Type":"ContainerStarted","Data":"c590ffec732cdac4065c91384950bc767b0b910a5104e4fbaae2044f34799d86"} Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.105025 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a3efe1f7-64a9-45d0-a949-536543461c61","Type":"ContainerStarted","Data":"04fb963b6f67f1f41bb7a06efc91c267257bd0e14adaf9bfee584b6aac52ae06"} Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.105122 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a3efe1f7-64a9-45d0-a949-536543461c61","Type":"ContainerStarted","Data":"361b8f78fe1f378b2f32072333972ec87be4f4025175f31b1c2bf133d4a0a170"} Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.105173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"a3efe1f7-64a9-45d0-a949-536543461c61","Type":"ContainerStarted","Data":"cac800dc6f05112eb9171cc198223e7056d3fc1ab1bdad828bd0c1b6d1aff636"} Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.134506 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.732417018 podStartE2EDuration="3.134481859s" podCreationTimestamp="2026-02-01 08:48:00 +0000 UTC" firstStartedPulling="2026-02-01 08:48:01.397166184 +0000 UTC m=+7231.883068547" lastFinishedPulling="2026-02-01 08:48:01.799231025 +0000 UTC m=+7232.285133388" observedRunningTime="2026-02-01 08:48:03.124217693 +0000 UTC m=+7233.610120066" watchObservedRunningTime="2026-02-01 08:48:03.134481859 +0000 UTC m=+7233.620384222" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.173423 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=1.875617184 podStartE2EDuration="2.173396765s" podCreationTimestamp="2026-02-01 08:48:01 +0000 UTC" firstStartedPulling="2026-02-01 08:48:02.230829741 +0000 UTC m=+7232.716732104" lastFinishedPulling="2026-02-01 08:48:02.528609322 +0000 UTC m=+7233.014511685" observedRunningTime="2026-02-01 08:48:03.162396089 +0000 UTC m=+7233.648298472" watchObservedRunningTime="2026-02-01 08:48:03.173396765 +0000 UTC m=+7233.659299128" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.366156 5127 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.571387 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.643859 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.643918 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.643990 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.644035 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.644099 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjfrr\" (UniqueName: \"kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.644139 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.644171 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data\") pod \"6aff8d68-2451-4781-82a0-d9448018ce3b\" (UID: \"6aff8d68-2451-4781-82a0-d9448018ce3b\") " Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.644827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs" (OuterVolumeSpecName: "logs") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.645296 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.652907 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.653007 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr" (OuterVolumeSpecName: "kube-api-access-kjfrr") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "kube-api-access-kjfrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.656991 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts" (OuterVolumeSpecName: "scripts") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.708005 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.731989 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data" (OuterVolumeSpecName: "config-data") pod "6aff8d68-2451-4781-82a0-d9448018ce3b" (UID: "6aff8d68-2451-4781-82a0-d9448018ce3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746115 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6aff8d68-2451-4781-82a0-d9448018ce3b-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746162 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746174 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjfrr\" (UniqueName: \"kubernetes.io/projected/6aff8d68-2451-4781-82a0-d9448018ce3b-kube-api-access-kjfrr\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746182 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746190 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746198 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6aff8d68-2451-4781-82a0-d9448018ce3b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:03 crc kubenswrapper[5127]: I0201 08:48:03.746208 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6aff8d68-2451-4781-82a0-d9448018ce3b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.119443 5127 generic.go:334] "Generic (PLEG): container finished" podID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerID="dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21" exitCode=0 Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.119498 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerDied","Data":"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21"} Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.119620 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.119989 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6aff8d68-2451-4781-82a0-d9448018ce3b","Type":"ContainerDied","Data":"3c376aed6e5fb9d11935db819a78d9c5cf91a6c51bf7b040d78439138abfe9bf"} Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.120062 5127 scope.go:117] "RemoveContainer" containerID="dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.164049 5127 scope.go:117] "RemoveContainer" containerID="4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.187662 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.199341 5127 scope.go:117] "RemoveContainer" containerID="dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21" Feb 01 08:48:04 crc kubenswrapper[5127]: E0201 08:48:04.199783 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21\": container with ID starting with dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21 not found: ID does not exist" containerID="dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.199823 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21"} err="failed to get container status \"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21\": rpc error: code = NotFound desc = could not find container \"dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21\": container with ID starting with dee6fcba7a767b1a39ff65ab51bb5c7b24f311040d0e41c88b94be8fc9ce1f21 not found: ID does not exist" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.199852 5127 scope.go:117] "RemoveContainer" containerID="4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.199929 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:48:04 crc kubenswrapper[5127]: E0201 08:48:04.202183 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee\": container with ID starting with 4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee not found: ID does not exist" containerID="4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.202220 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee"} err="failed to get container status \"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee\": rpc error: code = NotFound desc = could not find container \"4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee\": container with ID starting with 4628717cc42c58a39e6f9f4b0079d78b919c5b81e5ef7c3e7898120aa9db5dee not found: ID does not exist" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.226657 5127 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:48:04 crc kubenswrapper[5127]: E0201 08:48:04.227070 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.227087 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api" Feb 01 08:48:04 crc kubenswrapper[5127]: E0201 08:48:04.227107 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api-log" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.227115 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api-log" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.230737 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.230770 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" containerName="cinder-api-log" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.231879 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.234367 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.257892 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aff8d68-2451-4781-82a0-d9448018ce3b" path="/var/lib/kubelet/pods/6aff8d68-2451-4781-82a0-d9448018ce3b/volumes" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.258570 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.262933 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d900cca-9100-4348-babc-9c714853bb60-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.263098 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d900cca-9100-4348-babc-9c714853bb60-logs\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.263190 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t9km\" (UniqueName: \"kubernetes.io/projected/9d900cca-9100-4348-babc-9c714853bb60-kube-api-access-6t9km\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.263225 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 
08:48:04.263303 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.263390 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-scripts\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.263427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.364915 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d900cca-9100-4348-babc-9c714853bb60-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365007 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d900cca-9100-4348-babc-9c714853bb60-logs\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365053 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t9km\" (UniqueName: \"kubernetes.io/projected/9d900cca-9100-4348-babc-9c714853bb60-kube-api-access-6t9km\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365074 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365111 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365152 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-scripts\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.365175 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 
crc kubenswrapper[5127]: I0201 08:48:04.365945 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9d900cca-9100-4348-babc-9c714853bb60-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.366345 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d900cca-9100-4348-babc-9c714853bb60-logs\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.381357 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-scripts\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.385347 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data-custom\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.385999 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.386863 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d900cca-9100-4348-babc-9c714853bb60-config-data\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.399010 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t9km\" (UniqueName: \"kubernetes.io/projected/9d900cca-9100-4348-babc-9c714853bb60-kube-api-access-6t9km\") pod \"cinder-api-0\" (UID: \"9d900cca-9100-4348-babc-9c714853bb60\") " pod="openstack/cinder-api-0" Feb 01 08:48:04 crc kubenswrapper[5127]: I0201 08:48:04.555919 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 01 08:48:05 crc kubenswrapper[5127]: I0201 08:48:05.041138 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 01 08:48:05 crc kubenswrapper[5127]: I0201 08:48:05.137947 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d900cca-9100-4348-babc-9c714853bb60","Type":"ContainerStarted","Data":"894c32aefbd40db2aac093a30c1f63c1a44aab8baa4b644e208f2fe65c856c91"} Feb 01 08:48:05 crc kubenswrapper[5127]: I0201 08:48:05.756667 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.157945 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d900cca-9100-4348-babc-9c714853bb60","Type":"ContainerStarted","Data":"ff2ad0d85ec46be88e1b5eee1291cfc69ae9fbfb10d3e29f8f42f8e71f373cc2"} Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.498944 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.740821 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.741542 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.741732 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.742475 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:48:06 crc kubenswrapper[5127]: I0201 08:48:06.742675 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411" gracePeriod=600 Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.172674 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411" exitCode=0 Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.172766 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411"} Feb 01 08:48:07 crc 
kubenswrapper[5127]: I0201 08:48:07.173096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf"} Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.173123 5127 scope.go:117] "RemoveContainer" containerID="d0483d723b6675c2b20a121b5a3f911b2ac807eaa969feeacc0118bbb45ce85d" Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.177015 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9d900cca-9100-4348-babc-9c714853bb60","Type":"ContainerStarted","Data":"51ffb1b9e23b3c1de396d52b04267a82e47ac55f805b8a121b29a5f911ba6e18"} Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.177369 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 01 08:48:07 crc kubenswrapper[5127]: I0201 08:48:07.211918 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.211898093 podStartE2EDuration="3.211898093s" podCreationTimestamp="2026-02-01 08:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:48:07.207225198 +0000 UTC m=+7237.693127581" watchObservedRunningTime="2026-02-01 08:48:07.211898093 +0000 UTC m=+7237.697800466" Feb 01 08:48:08 crc kubenswrapper[5127]: I0201 08:48:08.674292 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 01 08:48:08 crc kubenswrapper[5127]: I0201 08:48:08.758997 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:09 crc kubenswrapper[5127]: I0201 08:48:09.209208 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="cinder-scheduler" containerID="cri-o://3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d" gracePeriod=30 Feb 01 08:48:09 crc kubenswrapper[5127]: I0201 08:48:09.209297 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="probe" containerID="cri-o://039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773" gracePeriod=30 Feb 01 08:48:10 crc kubenswrapper[5127]: I0201 08:48:10.229939 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerDied","Data":"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773"} Feb 01 08:48:10 crc kubenswrapper[5127]: I0201 08:48:10.230031 5127 generic.go:334] "Generic (PLEG): container finished" podID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerID="039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773" exitCode=0 Feb 01 08:48:10 crc kubenswrapper[5127]: I0201 08:48:10.980800 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.172997 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.227885 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjm86\" (UniqueName: \"kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.228355 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.228473 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.228530 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.228561 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.228725 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom\") pod \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\" (UID: \"6ebdbf4d-3544-4f3f-bf77-36e932fc887d\") " Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.234663 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.234946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.245272 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts" (OuterVolumeSpecName: "scripts") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.271949 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86" (OuterVolumeSpecName: "kube-api-access-vjm86") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "kube-api-access-vjm86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.272661 5127 generic.go:334] "Generic (PLEG): container finished" podID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerID="3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d" exitCode=0 Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.272706 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerDied","Data":"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d"} Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.272736 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ebdbf4d-3544-4f3f-bf77-36e932fc887d","Type":"ContainerDied","Data":"f97e85eb370639890664f4a3aef49c6b259dba4a5f1fc5a5a271dcaf14771e8c"} Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.272753 5127 scope.go:117] "RemoveContainer" containerID="039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.272956 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.287848 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.330864 5127 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.331195 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjm86\" (UniqueName: \"kubernetes.io/projected/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-kube-api-access-vjm86\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.331283 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.331415 5127 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.331563 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.342781 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data" (OuterVolumeSpecName: "config-data") pod "6ebdbf4d-3544-4f3f-bf77-36e932fc887d" (UID: "6ebdbf4d-3544-4f3f-bf77-36e932fc887d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.368275 5127 scope.go:117] "RemoveContainer" containerID="3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.387158 5127 scope.go:117] "RemoveContainer" containerID="039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773" Feb 01 08:48:11 crc kubenswrapper[5127]: E0201 08:48:11.387682 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773\": container with ID starting with 039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773 not found: ID does not exist" containerID="039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.387730 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773"} err="failed to get container status \"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773\": rpc error: code = NotFound desc = could not find container \"039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773\": container with ID starting with 039628464b4203d1703611396a31c0294155f1d57d3d8ba43d9790b1d81da773 not found: ID does not exist" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.387759 5127 scope.go:117] "RemoveContainer" containerID="3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d" Feb 01 08:48:11 crc kubenswrapper[5127]: E0201 08:48:11.388120 5127 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d\": container with ID starting with 3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d not found: ID does not exist" containerID="3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.388156 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d"} err="failed to get container status \"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d\": rpc error: code = NotFound desc = could not find container \"3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d\": container with ID starting with 3d739eed84d67d2ef2ca68aa353593f78f18a4ff9ef57fd3eaad3e2ce0a5ae8d not found: ID does not exist" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.434027 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ebdbf4d-3544-4f3f-bf77-36e932fc887d-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.609324 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.625798 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.650628 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:11 crc kubenswrapper[5127]: E0201 08:48:11.651486 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="probe" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.651514 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="probe" Feb 01 08:48:11 crc kubenswrapper[5127]: E0201 08:48:11.651542 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="cinder-scheduler" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.651551 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="cinder-scheduler" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.652125 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="probe" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.652179 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" containerName="cinder-scheduler" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.661492 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.679381 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.682958 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.739472 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.752850 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.752990 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b091241-0d4c-4126-b7f9-39cd4a145fd9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.753032 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.753066 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.753092 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thf9g\" (UniqueName: \"kubernetes.io/projected/5b091241-0d4c-4126-b7f9-39cd4a145fd9-kube-api-access-thf9g\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.753129 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855127 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b091241-0d4c-4126-b7f9-39cd4a145fd9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855181 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855218 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855259 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thf9g\" (UniqueName: \"kubernetes.io/projected/5b091241-0d4c-4126-b7f9-39cd4a145fd9-kube-api-access-thf9g\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855290 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5b091241-0d4c-4126-b7f9-39cd4a145fd9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855425 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.855567 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.863953 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.864640 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.865069 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.867515 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b091241-0d4c-4126-b7f9-39cd4a145fd9-scripts\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.873296 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thf9g\" (UniqueName: 
\"kubernetes.io/projected/5b091241-0d4c-4126-b7f9-39cd4a145fd9-kube-api-access-thf9g\") pod \"cinder-scheduler-0\" (UID: \"5b091241-0d4c-4126-b7f9-39cd4a145fd9\") " pod="openstack/cinder-scheduler-0" Feb 01 08:48:11 crc kubenswrapper[5127]: I0201 08:48:11.993323 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 01 08:48:12 crc kubenswrapper[5127]: I0201 08:48:12.246734 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ebdbf4d-3544-4f3f-bf77-36e932fc887d" path="/var/lib/kubelet/pods/6ebdbf4d-3544-4f3f-bf77-36e932fc887d/volumes" Feb 01 08:48:12 crc kubenswrapper[5127]: I0201 08:48:12.447478 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 01 08:48:13 crc kubenswrapper[5127]: I0201 08:48:13.293763 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b091241-0d4c-4126-b7f9-39cd4a145fd9","Type":"ContainerStarted","Data":"f3ae10b9eb5d10c4ccd15707f98d8799b83613a101f14dd8c7a13307ddf6c213"} Feb 01 08:48:13 crc kubenswrapper[5127]: I0201 08:48:13.294070 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b091241-0d4c-4126-b7f9-39cd4a145fd9","Type":"ContainerStarted","Data":"913e7bb26908aa4f994b053321cefba66b492014c2b5406ed81f0803c7a7abd3"} Feb 01 08:48:14 crc kubenswrapper[5127]: I0201 08:48:14.312694 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5b091241-0d4c-4126-b7f9-39cd4a145fd9","Type":"ContainerStarted","Data":"59877d654795e40bfdcc2cce96a3ed38e786d2381138ef9576307b0a14e3ec73"} Feb 01 08:48:14 crc kubenswrapper[5127]: I0201 08:48:14.345340 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.345303941 podStartE2EDuration="3.345303941s" podCreationTimestamp="2026-02-01 08:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:48:14.34121842 +0000 UTC m=+7244.827120803" watchObservedRunningTime="2026-02-01 08:48:14.345303941 +0000 UTC m=+7244.831206304" Feb 01 08:48:16 crc kubenswrapper[5127]: I0201 08:48:16.382171 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 01 08:48:16 crc kubenswrapper[5127]: I0201 08:48:16.994395 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 01 08:48:22 crc kubenswrapper[5127]: I0201 08:48:22.168007 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 01 08:48:43 crc kubenswrapper[5127]: I0201 08:48:43.592214 5127 scope.go:117] "RemoveContainer" containerID="f547792f9611b04d30f30dd56e3f2c7128e1738f7ea4294ca325640d5ced28e0" Feb 01 08:48:43 crc kubenswrapper[5127]: I0201 08:48:43.632184 5127 scope.go:117] "RemoveContainer" containerID="a108f8e2ab869da7ca39c1f51e9bacdb72bb32b915dced6111b292aad07ed53c" Feb 01 08:48:47 crc kubenswrapper[5127]: I0201 08:48:47.064087 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b02d-account-create-update-vnwlv"] Feb 01 08:48:47 crc kubenswrapper[5127]: I0201 08:48:47.078275 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-sqpmb"] Feb 01 08:48:47 crc kubenswrapper[5127]: I0201 08:48:47.087700 5127 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/keystone-b02d-account-create-update-vnwlv"] Feb 01 08:48:47 crc kubenswrapper[5127]: I0201 08:48:47.096573 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-sqpmb"] Feb 01 08:48:48 crc kubenswrapper[5127]: I0201 08:48:48.247113 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5127f0eb-21e9-4559-b744-57e7ad40df33" path="/var/lib/kubelet/pods/5127f0eb-21e9-4559-b744-57e7ad40df33/volumes" Feb 01 08:48:48 crc kubenswrapper[5127]: I0201 08:48:48.247854 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8142c78-3373-4279-b762-933b0d61711b" path="/var/lib/kubelet/pods/c8142c78-3373-4279-b762-933b0d61711b/volumes" Feb 01 08:49:00 crc kubenswrapper[5127]: I0201 08:49:00.051788 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-m5j5j"] Feb 01 08:49:00 crc kubenswrapper[5127]: I0201 08:49:00.064013 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-m5j5j"] Feb 01 08:49:00 crc kubenswrapper[5127]: I0201 08:49:00.250936 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bbdf245-3582-4862-b9e6-5551b459025b" path="/var/lib/kubelet/pods/8bbdf245-3582-4862-b9e6-5551b459025b/volumes" Feb 01 08:49:13 crc kubenswrapper[5127]: I0201 08:49:13.070527 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tvzdq"] Feb 01 08:49:13 crc kubenswrapper[5127]: I0201 08:49:13.086414 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tvzdq"] Feb 01 08:49:14 crc kubenswrapper[5127]: I0201 08:49:14.253462 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66212b37-2c86-4588-badb-15a7e9b260a6" path="/var/lib/kubelet/pods/66212b37-2c86-4588-badb-15a7e9b260a6/volumes" Feb 01 08:49:14 crc kubenswrapper[5127]: I0201 08:49:14.859653 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:14 crc kubenswrapper[5127]: I0201 08:49:14.863054 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:14 crc kubenswrapper[5127]: I0201 08:49:14.881151 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.023624 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwjn8\" (UniqueName: \"kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.023704 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.023778 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.126599 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.126644 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.126759 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwjn8\" (UniqueName: \"kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.127703 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.127777 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.156505 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nwjn8\" (UniqueName: \"kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8\") pod \"community-operators-mwqb4\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.214235 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:15 crc kubenswrapper[5127]: I0201 08:49:15.706971 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:15 crc kubenswrapper[5127]: W0201 08:49:15.712751 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a4d4bc_596e_4036_8719_445c86633455.slice/crio-35ab89fa80f32b275fff26e9d3bbe5579b731dad52ddc5a50d81d3595f15613a WatchSource:0}: Error finding container 35ab89fa80f32b275fff26e9d3bbe5579b731dad52ddc5a50d81d3595f15613a: Status 404 returned error can't find the container with id 35ab89fa80f32b275fff26e9d3bbe5579b731dad52ddc5a50d81d3595f15613a Feb 01 08:49:16 crc kubenswrapper[5127]: I0201 08:49:16.090946 5127 generic.go:334] "Generic (PLEG): container finished" podID="69a4d4bc-596e-4036-8719-445c86633455" containerID="7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c" exitCode=0 Feb 01 08:49:16 crc kubenswrapper[5127]: I0201 08:49:16.090992 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerDied","Data":"7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c"} Feb 01 08:49:16 crc kubenswrapper[5127]: I0201 08:49:16.091383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerStarted","Data":"35ab89fa80f32b275fff26e9d3bbe5579b731dad52ddc5a50d81d3595f15613a"} Feb 01 08:49:16 crc kubenswrapper[5127]: I0201 08:49:16.094093 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:49:17 crc kubenswrapper[5127]: I0201 08:49:17.109276 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerStarted","Data":"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5"} Feb 01 08:49:18 crc kubenswrapper[5127]: I0201 08:49:18.124887 5127 generic.go:334] "Generic (PLEG): container finished" podID="69a4d4bc-596e-4036-8719-445c86633455" containerID="d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5" exitCode=0 Feb 01 08:49:18 crc kubenswrapper[5127]: I0201 08:49:18.125365 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerDied","Data":"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5"} Feb 01 08:49:19 crc kubenswrapper[5127]: I0201 08:49:19.141754 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerStarted","Data":"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493"} Feb 01 08:49:19 crc kubenswrapper[5127]: I0201 
08:49:19.200686 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mwqb4" podStartSLOduration=2.770455048 podStartE2EDuration="5.200655087s" podCreationTimestamp="2026-02-01 08:49:14 +0000 UTC" firstStartedPulling="2026-02-01 08:49:16.093658775 +0000 UTC m=+7306.579561148" lastFinishedPulling="2026-02-01 08:49:18.523858814 +0000 UTC m=+7309.009761187" observedRunningTime="2026-02-01 08:49:19.163298644 +0000 UTC m=+7309.649201007" watchObservedRunningTime="2026-02-01 08:49:19.200655087 +0000 UTC m=+7309.686557460" Feb 01 08:49:25 crc kubenswrapper[5127]: I0201 08:49:25.215042 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:25 crc kubenswrapper[5127]: I0201 08:49:25.215535 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:25 crc kubenswrapper[5127]: I0201 08:49:25.282093 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:26 crc kubenswrapper[5127]: I0201 08:49:26.296811 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:26 crc kubenswrapper[5127]: I0201 08:49:26.370045 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.235861 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mwqb4" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="registry-server" containerID="cri-o://50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493" gracePeriod=2 Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.845719 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.912682 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities\") pod \"69a4d4bc-596e-4036-8719-445c86633455\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.913089 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwjn8\" (UniqueName: \"kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8\") pod \"69a4d4bc-596e-4036-8719-445c86633455\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.913252 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content\") pod \"69a4d4bc-596e-4036-8719-445c86633455\" (UID: \"69a4d4bc-596e-4036-8719-445c86633455\") " Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.913341 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities" (OuterVolumeSpecName: "utilities") pod "69a4d4bc-596e-4036-8719-445c86633455" (UID: "69a4d4bc-596e-4036-8719-445c86633455"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.913775 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.919802 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8" (OuterVolumeSpecName: "kube-api-access-nwjn8") pod "69a4d4bc-596e-4036-8719-445c86633455" (UID: "69a4d4bc-596e-4036-8719-445c86633455"). InnerVolumeSpecName "kube-api-access-nwjn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:28 crc kubenswrapper[5127]: I0201 08:49:28.970334 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69a4d4bc-596e-4036-8719-445c86633455" (UID: "69a4d4bc-596e-4036-8719-445c86633455"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.015510 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a4d4bc-596e-4036-8719-445c86633455-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.015893 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwjn8\" (UniqueName: \"kubernetes.io/projected/69a4d4bc-596e-4036-8719-445c86633455-kube-api-access-nwjn8\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.251875 5127 generic.go:334] "Generic (PLEG): container finished" podID="69a4d4bc-596e-4036-8719-445c86633455" containerID="50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493" exitCode=0 Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.251911 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mwqb4" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.251942 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerDied","Data":"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493"} Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.252001 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mwqb4" event={"ID":"69a4d4bc-596e-4036-8719-445c86633455","Type":"ContainerDied","Data":"35ab89fa80f32b275fff26e9d3bbe5579b731dad52ddc5a50d81d3595f15613a"} Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.252034 5127 scope.go:117] "RemoveContainer" containerID="50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.286171 5127 scope.go:117] "RemoveContainer" containerID="d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.322904 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.339987 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mwqb4"] Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.340728 5127 scope.go:117] "RemoveContainer" containerID="7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.374951 5127 scope.go:117] "RemoveContainer" containerID="50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493" Feb 01 08:49:29 crc kubenswrapper[5127]: E0201 08:49:29.375768 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493\": container with ID starting with 50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493 not found: ID does not exist" containerID="50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.375925 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493"} err="failed to get container status \"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493\": rpc error: code = NotFound desc = could not find container \"50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493\": container with ID starting with 50c80bd0f3773a325535f8775f31f112c3cf859e361f81bc543fbc76dd14f493 not found: ID does not exist" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.376042 5127 scope.go:117] "RemoveContainer" containerID="d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5" Feb 01 08:49:29 crc kubenswrapper[5127]: E0201 08:49:29.377025 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5\": container with ID starting with d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5 not found: ID does not exist" containerID="d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.377076 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5"} err="failed to get container status \"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5\": rpc error: code = NotFound desc = could not find container \"d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5\": container with ID starting with d3d1af714da4db5a8bd721539f6095d6e292d5c7352f6428e9d1b2477d4d79b5 not found: ID does not exist" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.377109 5127 scope.go:117] "RemoveContainer" containerID="7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c" Feb 01 08:49:29 crc kubenswrapper[5127]: E0201 08:49:29.377967 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c\": container with ID starting with 7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c not found: ID does not exist" containerID="7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c" Feb 01 08:49:29 crc kubenswrapper[5127]: I0201 08:49:29.378101 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c"} err="failed to get container status \"7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c\": rpc error: code = NotFound desc = could not find container \"7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c\": container with ID starting with 7d4c4a4e55a7e765df3cbbf9fb839b6fafe0082da695f7f3de56153920402f4c not found: ID does not exist" Feb 01 08:49:30 crc kubenswrapper[5127]: I0201 08:49:30.257040 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a4d4bc-596e-4036-8719-445c86633455" path="/var/lib/kubelet/pods/69a4d4bc-596e-4036-8719-445c86633455/volumes" Feb 01 08:49:43 crc kubenswrapper[5127]: I0201 08:49:43.761037 5127 scope.go:117] "RemoveContainer" containerID="b178f58c1b7067de0d8c17ee57538e2eede6ca713a22b4f2a0a1a1df1dd80c21" Feb 01 08:49:43 crc kubenswrapper[5127]: I0201 08:49:43.826988 5127 scope.go:117] "RemoveContainer" containerID="4435ef192242c7aad77f3220df4e309299a7f14399017092aa5e357f780ff509" Feb 01 08:49:43 crc kubenswrapper[5127]: I0201 08:49:43.903139 5127 scope.go:117] "RemoveContainer" containerID="0f988b660e823ed3c10669a713f638558d99d8476f293d13703a52c5d896bc54" Feb 01 08:49:43 crc kubenswrapper[5127]: I0201 08:49:43.958780 5127 scope.go:117] "RemoveContainer" containerID="72ca46e50733b0f6f0afbaabbebaeb930db3c6653e35135bde5f232bc45c05de" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.373850 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:49:55 crc kubenswrapper[5127]: E0201 08:49:55.374849 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="extract-utilities" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.374864 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="extract-utilities" Feb 01 08:49:55 crc kubenswrapper[5127]: E0201 08:49:55.374881 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="extract-content" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.374888 
5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="extract-content" Feb 01 08:49:55 crc kubenswrapper[5127]: E0201 08:49:55.374899 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="registry-server" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.374905 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="registry-server" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.375107 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a4d4bc-596e-4036-8719-445c86633455" containerName="registry-server" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.376138 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.391773 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.392235 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.392319 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-lqjmp" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.395959 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.408041 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.454239 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.454537 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-log" containerID="cri-o://24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068" gracePeriod=30 Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.454738 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-httpd" containerID="cri-o://d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132" gracePeriod=30 Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.498004 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.498233 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-log" containerID="cri-o://576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601" gracePeriod=30 Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.498391 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-httpd" containerID="cri-o://a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee" gracePeriod=30 Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 
08:49:55.522028 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqsdf\" (UniqueName: \"kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.522159 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.522327 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.522484 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.522540 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.547454 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.549263 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.574939 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.594556 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerID="24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068" exitCode=143 Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.594623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerDied","Data":"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068"} Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.625233 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.625336 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.625488 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqsdf\" (UniqueName: \"kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.625730 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.625759 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.626737 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.628887 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.630740 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.642871 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.652357 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqsdf\" (UniqueName: \"kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf\") pod \"horizon-fb665c9d5-9624z\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.711147 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.727786 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.727866 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.727934 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.728000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.728060 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcdcv\" (UniqueName: \"kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832213 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832687 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832717 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832825 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832869 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.832925 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcdcv\" (UniqueName: \"kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.834895 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.836379 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.840170 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.854089 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcdcv\" (UniqueName: \"kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv\") pod \"horizon-7b5476f5d5-m5t2w\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:55 crc kubenswrapper[5127]: I0201 08:49:55.864649 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.137946 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.281526 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.285398 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.285495 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.344863 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.431722 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.455213 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.455384 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.455428 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.455449 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwbhn\" (UniqueName: \"kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.455468 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.557863 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.558335 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.558387 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.558408 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwbhn\" (UniqueName: \"kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.558429 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.558800 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.559173 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.559614 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.565384 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.576955 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwbhn\" (UniqueName: \"kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn\") pod \"horizon-568b46bf6c-22znp\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.609533 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerStarted","Data":"6490f89eb6b2c5a2b3e19319702721d85b418b7c0ff66166b63442ea75b81b99"} Feb 01 08:49:56 crc kubenswrapper[5127]: 
I0201 08:49:56.614217 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerID="576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601" exitCode=143 Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.614304 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerDied","Data":"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601"} Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.616088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerStarted","Data":"f454ae75ca12d2d922eab6b50dbb02122715f639cbffda7bcffcb36df750921c"} Feb 01 08:49:56 crc kubenswrapper[5127]: I0201 08:49:56.646859 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:49:57 crc kubenswrapper[5127]: I0201 08:49:57.184817 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:49:57 crc kubenswrapper[5127]: I0201 08:49:57.629829 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerStarted","Data":"57abd8d63d1daa719ca98e1deb7da35a1c8851263517bf0bf2e252431c8b3717"} Feb 01 08:49:58 crc kubenswrapper[5127]: I0201 08:49:58.942319 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.54:9292/healthcheck\": dial tcp 10.217.1.54:9292: connect: connection refused" Feb 01 08:49:58 crc kubenswrapper[5127]: I0201 08:49:58.942704 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.54:9292/healthcheck\": dial tcp 10.217.1.54:9292: connect: connection refused" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.359929 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.448376 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.546934 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.546986 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmfg\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547022 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547059 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547094 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547125 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547143 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtg87\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547194 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547252 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547273 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: 
\"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547350 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547428 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547450 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs\") pod \"4d3db2ee-c313-45b4-b9e7-a043a619a101\" (UID: \"4d3db2ee-c313-45b4-b9e7-a043a619a101\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.547470 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs\") pod \"7e63579e-7087-46b0-b5ee-ea62558b2b58\" (UID: \"7e63579e-7087-46b0-b5ee-ea62558b2b58\") " Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.548464 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs" (OuterVolumeSpecName: "logs") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.549646 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.549875 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.550070 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs" (OuterVolumeSpecName: "logs") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.557543 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg" (OuterVolumeSpecName: "kube-api-access-dnmfg") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "kube-api-access-dnmfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.558507 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph" (OuterVolumeSpecName: "ceph") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.560325 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts" (OuterVolumeSpecName: "scripts") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.564007 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts" (OuterVolumeSpecName: "scripts") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.564072 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph" (OuterVolumeSpecName: "ceph") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.567473 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87" (OuterVolumeSpecName: "kube-api-access-rtg87") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "kube-api-access-rtg87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.586805 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.589626 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.633351 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data" (OuterVolumeSpecName: "config-data") pod "7e63579e-7087-46b0-b5ee-ea62558b2b58" (UID: "7e63579e-7087-46b0-b5ee-ea62558b2b58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649734 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649771 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649780 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649788 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649797 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7e63579e-7087-46b0-b5ee-ea62558b2b58-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649809 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmfg\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-kube-api-access-dnmfg\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649818 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649827 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4d3db2ee-c313-45b4-b9e7-a043a619a101-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649837 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649847 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtg87\" (UniqueName: \"kubernetes.io/projected/7e63579e-7087-46b0-b5ee-ea62558b2b58-kube-api-access-rtg87\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649855 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649863 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e63579e-7087-46b0-b5ee-ea62558b2b58-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.649873 5127 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4d3db2ee-c313-45b4-b9e7-a043a619a101-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.659151 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerID="a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee" exitCode=0 Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.659219 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerDied","Data":"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee"} Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.659296 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4d3db2ee-c313-45b4-b9e7-a043a619a101","Type":"ContainerDied","Data":"7a662a992c1f6d52360885fd24c5948cfb1a244b7019b4399f470cf93d009b71"} Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.659308 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.659315 5127 scope.go:117] "RemoveContainer" containerID="a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.662138 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data" (OuterVolumeSpecName: "config-data") pod "4d3db2ee-c313-45b4-b9e7-a043a619a101" (UID: "4d3db2ee-c313-45b4-b9e7-a043a619a101"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.663428 5127 generic.go:334] "Generic (PLEG): container finished" podID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerID="d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132" exitCode=0 Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.663464 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerDied","Data":"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132"} Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.663495 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7e63579e-7087-46b0-b5ee-ea62558b2b58","Type":"ContainerDied","Data":"58c1716a09701414054daf5a04cdade9eb6b230f3127d5a0cf93ed3378126ac8"} Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.663562 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.713783 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.720231 5127 scope.go:117] "RemoveContainer" containerID="576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.732621 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.745487 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.746082 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746101 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.746117 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746125 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.746142 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746148 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.746170 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746177 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746387 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746411 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-httpd" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746429 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.746438 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" containerName="glance-log" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.747791 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.751545 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d3db2ee-c313-45b4-b9e7-a043a619a101-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.751787 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.774780 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.814857 5127 scope.go:117] "RemoveContainer" containerID="a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.815822 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee\": container with ID starting with a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee not found: ID does not exist" containerID="a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.816378 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee"} err="failed to get container status \"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee\": rpc error: code = NotFound desc = could not find container \"a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee\": container with ID starting with a39c49d57237c936e43701be1900b91abb76894b73e04a0ef6f31996f3b7e8ee not found: ID does not exist" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.816418 5127 scope.go:117] "RemoveContainer" containerID="576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.817137 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601\": container with ID starting with 576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601 not found: ID does not exist" containerID="576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.817229 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601"} err="failed to get container status \"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601\": rpc error: code = NotFound desc = could not find container \"576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601\": container with ID starting with 576b0473f3652f7b44c1cfffc2dbc531d5ee5f0daa65d53dbbb0f9f4c86bb601 not found: ID does not exist" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.817258 5127 scope.go:117] "RemoveContainer" containerID="d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853250 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-logs\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853341 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-ceph\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853399 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853435 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853465 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bgm\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-kube-api-access-k2bgm\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853489 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.853510 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.868866 5127 scope.go:117] "RemoveContainer" containerID="24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.898085 5127 scope.go:117] "RemoveContainer" containerID="d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.898748 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132\": container with ID starting with d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132 not found: ID does not exist" containerID="d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.898799 5127 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132"} err="failed to get container status \"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132\": rpc error: code = NotFound desc = could not find container \"d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132\": container with ID starting with d09130c65bd42ac898ff271fb741d447357771676cc7a54c5737ef5115d61132 not found: ID does not exist" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.898844 5127 scope.go:117] "RemoveContainer" containerID="24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068" Feb 01 08:49:59 crc kubenswrapper[5127]: E0201 08:49:59.899246 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068\": container with ID starting with 24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068 not found: ID does not exist" containerID="24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.899297 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068"} err="failed to get container status \"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068\": rpc error: code = NotFound desc = could not find container \"24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068\": container with ID starting with 24d5a76d912e8416cdf23ed70198aa67c76d7ca165a69127cc87027d534dd068 not found: ID does not exist" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.955514 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-ceph\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.955771 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.955873 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.955933 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bgm\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-kube-api-access-k2bgm\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.955972 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-httpd-run\") pod \"glance-default-external-api-0\" 
(UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.956024 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.957071 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.959030 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-logs\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.959441 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24109b1d-3e62-4cda-8dfb-8591d4042e6f-logs\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.962700 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.962709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-ceph\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.963086 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-scripts\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.967838 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24109b1d-3e62-4cda-8dfb-8591d4042e6f-config-data\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:49:59 crc kubenswrapper[5127]: I0201 08:49:59.974617 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bgm\" (UniqueName: \"kubernetes.io/projected/24109b1d-3e62-4cda-8dfb-8591d4042e6f-kube-api-access-k2bgm\") pod \"glance-default-external-api-0\" (UID: \"24109b1d-3e62-4cda-8dfb-8591d4042e6f\") " pod="openstack/glance-default-external-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.080532 5127 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.101814 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.115986 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.118383 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.120664 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.125552 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.127466 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.255417 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3db2ee-c313-45b4-b9e7-a043a619a101" path="/var/lib/kubelet/pods/4d3db2ee-c313-45b4-b9e7-a043a619a101/volumes" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.256151 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e63579e-7087-46b0-b5ee-ea62558b2b58" path="/var/lib/kubelet/pods/7e63579e-7087-46b0-b5ee-ea62558b2b58/volumes" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270092 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270140 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-292q2\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-kube-api-access-292q2\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270190 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270204 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270252 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270277 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.270293 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.372131 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.372502 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-292q2\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-kube-api-access-292q2\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.372956 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.373323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.373342 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.373411 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.373439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") 
" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.373455 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.375569 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5704ab50-5657-44ce-bf06-34a2961cbfb3-logs\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.377952 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.378695 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.393186 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.397693 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5704ab50-5657-44ce-bf06-34a2961cbfb3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.398720 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-292q2\" (UniqueName: \"kubernetes.io/projected/5704ab50-5657-44ce-bf06-34a2961cbfb3-kube-api-access-292q2\") pod \"glance-default-internal-api-0\" (UID: \"5704ab50-5657-44ce-bf06-34a2961cbfb3\") " pod="openstack/glance-default-internal-api-0" Feb 01 08:50:00 crc kubenswrapper[5127]: I0201 08:50:00.455122 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:05 crc kubenswrapper[5127]: I0201 08:50:05.696709 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 01 08:50:05 crc kubenswrapper[5127]: W0201 08:50:05.711539 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5704ab50_5657_44ce_bf06_34a2961cbfb3.slice/crio-817fbef40b114f262ac339e41bb33fa63d784e7a1b7ee9d6b734755c9b1fc8d2 WatchSource:0}: Error finding container 817fbef40b114f262ac339e41bb33fa63d784e7a1b7ee9d6b734755c9b1fc8d2: Status 404 returned error can't find the container with id 817fbef40b114f262ac339e41bb33fa63d784e7a1b7ee9d6b734755c9b1fc8d2 Feb 01 08:50:05 crc kubenswrapper[5127]: I0201 08:50:05.764847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5704ab50-5657-44ce-bf06-34a2961cbfb3","Type":"ContainerStarted","Data":"817fbef40b114f262ac339e41bb33fa63d784e7a1b7ee9d6b734755c9b1fc8d2"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.036052 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.783409 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerStarted","Data":"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.783795 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerStarted","Data":"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.793450 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5704ab50-5657-44ce-bf06-34a2961cbfb3","Type":"ContainerStarted","Data":"f7bdbf5eb0ad3abfdd546b7db1f4c30f0cc9434ac2d4abbdddea55b140b77e7b"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.796034 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24109b1d-3e62-4cda-8dfb-8591d4042e6f","Type":"ContainerStarted","Data":"be7b783ac638226eca0f3aa471ee220544a6fd2e9019a2f8242c042cd3198042"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.796090 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24109b1d-3e62-4cda-8dfb-8591d4042e6f","Type":"ContainerStarted","Data":"c0fa777820f6be1c4cbad59c9cc9ac7e2f13525d97c09072db518b84a3520792"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.801051 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerStarted","Data":"73b4fa46f1cde45a75051cc7ded4b36b68c3395d8f12900fd0f2a4213125cf63"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.801111 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerStarted","Data":"639357ac8ca8cec96b97ad463f851183e5f62d8907807035bc870d32046a808e"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.813092 5127 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerStarted","Data":"872e5bb62c655b638182627d1580303c7e8f1b618ae8e5bb45e3834c1eff26a6"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.813148 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerStarted","Data":"67dacc74ccd6e500fc3df11ce5fbfa3b38ea8c346923f439ba5cf2a0840b3f97"} Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.813305 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b5476f5d5-m5t2w" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon-log" containerID="cri-o://67dacc74ccd6e500fc3df11ce5fbfa3b38ea8c346923f439ba5cf2a0840b3f97" gracePeriod=30 Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.813560 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b5476f5d5-m5t2w" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon" containerID="cri-o://872e5bb62c655b638182627d1580303c7e8f1b618ae8e5bb45e3834c1eff26a6" gracePeriod=30 Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.819392 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-568b46bf6c-22znp" podStartSLOduration=2.48815058 podStartE2EDuration="10.819361868s" podCreationTimestamp="2026-02-01 08:49:56 +0000 UTC" firstStartedPulling="2026-02-01 08:49:57.204689649 +0000 UTC m=+7347.690592032" lastFinishedPulling="2026-02-01 08:50:05.535900957 +0000 UTC m=+7356.021803320" observedRunningTime="2026-02-01 08:50:06.803511553 +0000 UTC m=+7357.289413956" watchObservedRunningTime="2026-02-01 08:50:06.819361868 +0000 UTC m=+7357.305264231" Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.829504 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-fb665c9d5-9624z" podStartSLOduration=2.5683675 podStartE2EDuration="11.82947972s" podCreationTimestamp="2026-02-01 08:49:55 +0000 UTC" firstStartedPulling="2026-02-01 08:49:56.275171897 +0000 UTC m=+7346.761074260" lastFinishedPulling="2026-02-01 08:50:05.536284117 +0000 UTC m=+7356.022186480" observedRunningTime="2026-02-01 08:50:06.825754639 +0000 UTC m=+7357.311657002" watchObservedRunningTime="2026-02-01 08:50:06.82947972 +0000 UTC m=+7357.315382083" Feb 01 08:50:06 crc kubenswrapper[5127]: I0201 08:50:06.851507 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b5476f5d5-m5t2w" podStartSLOduration=3.018888345 podStartE2EDuration="11.851485912s" podCreationTimestamp="2026-02-01 08:49:55 +0000 UTC" firstStartedPulling="2026-02-01 08:49:56.437237301 +0000 UTC m=+7346.923139664" lastFinishedPulling="2026-02-01 08:50:05.269834868 +0000 UTC m=+7355.755737231" observedRunningTime="2026-02-01 08:50:06.84513243 +0000 UTC m=+7357.331034793" watchObservedRunningTime="2026-02-01 08:50:06.851485912 +0000 UTC m=+7357.337388275" Feb 01 08:50:07 crc kubenswrapper[5127]: I0201 08:50:07.829000 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5704ab50-5657-44ce-bf06-34a2961cbfb3","Type":"ContainerStarted","Data":"a88078d662bf0467c14cafe2c954bb18270826f45dbbacbe07f707e140b572ca"} Feb 01 08:50:07 crc kubenswrapper[5127]: I0201 08:50:07.834178 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"24109b1d-3e62-4cda-8dfb-8591d4042e6f","Type":"ContainerStarted","Data":"7f0394c99c3402db74d863e61deb0fdb4bee13cc474fbce9cb1745678381ef31"} Feb 01 08:50:07 crc kubenswrapper[5127]: I0201 08:50:07.865246 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.865217107 podStartE2EDuration="7.865217107s" podCreationTimestamp="2026-02-01 08:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:50:07.850620724 +0000 UTC m=+7358.336523087" watchObservedRunningTime="2026-02-01 08:50:07.865217107 +0000 UTC m=+7358.351119470" Feb 01 08:50:07 crc kubenswrapper[5127]: I0201 08:50:07.900539 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.900516695 podStartE2EDuration="8.900516695s" podCreationTimestamp="2026-02-01 08:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:50:07.891475902 +0000 UTC m=+7358.377378285" watchObservedRunningTime="2026-02-01 08:50:07.900516695 +0000 UTC m=+7358.386419058" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.128249 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.128343 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.167593 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.181916 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.457858 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.457935 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.500994 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.508678 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.866321 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.866374 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.866400 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:10 crc kubenswrapper[5127]: I0201 08:50:10.866427 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:14 crc 
kubenswrapper[5127]: I0201 08:50:14.076082 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:14 crc kubenswrapper[5127]: I0201 08:50:14.159291 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 08:50:14 crc kubenswrapper[5127]: I0201 08:50:14.159700 5127 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 01 08:50:14 crc kubenswrapper[5127]: I0201 08:50:14.161655 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 01 08:50:15 crc kubenswrapper[5127]: I0201 08:50:15.712267 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:50:15 crc kubenswrapper[5127]: I0201 08:50:15.712649 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:50:15 crc kubenswrapper[5127]: I0201 08:50:15.866845 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:50:16 crc kubenswrapper[5127]: I0201 08:50:16.248310 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 01 08:50:16 crc kubenswrapper[5127]: I0201 08:50:16.647476 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:50:16 crc kubenswrapper[5127]: I0201 08:50:16.647527 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:50:16 crc kubenswrapper[5127]: I0201 08:50:16.650682 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Feb 01 08:50:26 crc kubenswrapper[5127]: I0201 08:50:26.649077 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Feb 01 08:50:27 crc kubenswrapper[5127]: I0201 08:50:27.642038 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:50:29 crc kubenswrapper[5127]: I0201 08:50:29.276850 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:50:36 crc kubenswrapper[5127]: I0201 08:50:36.740957 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:50:36 crc kubenswrapper[5127]: I0201 08:50:36.741775 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.179968 5127 generic.go:334] "Generic (PLEG): container finished" podID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerID="872e5bb62c655b638182627d1580303c7e8f1b618ae8e5bb45e3834c1eff26a6" exitCode=137 Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.180307 5127 generic.go:334] "Generic (PLEG): container finished" podID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerID="67dacc74ccd6e500fc3df11ce5fbfa3b38ea8c346923f439ba5cf2a0840b3f97" exitCode=137 Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.180023 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerDied","Data":"872e5bb62c655b638182627d1580303c7e8f1b618ae8e5bb45e3834c1eff26a6"} Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.180384 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerDied","Data":"67dacc74ccd6e500fc3df11ce5fbfa3b38ea8c346923f439ba5cf2a0840b3f97"} Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.283930 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.429301 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data\") pod \"f67fc15a-ea85-41f4-b0da-f18e073a0932\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.429369 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts\") pod \"f67fc15a-ea85-41f4-b0da-f18e073a0932\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.429420 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs\") pod \"f67fc15a-ea85-41f4-b0da-f18e073a0932\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.429473 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcdcv\" (UniqueName: \"kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv\") pod \"f67fc15a-ea85-41f4-b0da-f18e073a0932\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.429523 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key\") pod \"f67fc15a-ea85-41f4-b0da-f18e073a0932\" (UID: \"f67fc15a-ea85-41f4-b0da-f18e073a0932\") " Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.430480 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs" (OuterVolumeSpecName: "logs") pod "f67fc15a-ea85-41f4-b0da-f18e073a0932" (UID: "f67fc15a-ea85-41f4-b0da-f18e073a0932"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.440016 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv" (OuterVolumeSpecName: "kube-api-access-tcdcv") pod "f67fc15a-ea85-41f4-b0da-f18e073a0932" (UID: "f67fc15a-ea85-41f4-b0da-f18e073a0932"). InnerVolumeSpecName "kube-api-access-tcdcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.441985 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f67fc15a-ea85-41f4-b0da-f18e073a0932" (UID: "f67fc15a-ea85-41f4-b0da-f18e073a0932"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.455174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data" (OuterVolumeSpecName: "config-data") pod "f67fc15a-ea85-41f4-b0da-f18e073a0932" (UID: "f67fc15a-ea85-41f4-b0da-f18e073a0932"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.487971 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts" (OuterVolumeSpecName: "scripts") pod "f67fc15a-ea85-41f4-b0da-f18e073a0932" (UID: "f67fc15a-ea85-41f4-b0da-f18e073a0932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.531819 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.532015 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f67fc15a-ea85-41f4-b0da-f18e073a0932-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.532096 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67fc15a-ea85-41f4-b0da-f18e073a0932-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.532162 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcdcv\" (UniqueName: \"kubernetes.io/projected/f67fc15a-ea85-41f4-b0da-f18e073a0932-kube-api-access-tcdcv\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:37 crc kubenswrapper[5127]: I0201 08:50:37.532222 5127 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f67fc15a-ea85-41f4-b0da-f18e073a0932-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.191035 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b5476f5d5-m5t2w" event={"ID":"f67fc15a-ea85-41f4-b0da-f18e073a0932","Type":"ContainerDied","Data":"6490f89eb6b2c5a2b3e19319702721d85b418b7c0ff66166b63442ea75b81b99"} Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.191118 5127 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/horizon-7b5476f5d5-m5t2w" Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.191153 5127 scope.go:117] "RemoveContainer" containerID="872e5bb62c655b638182627d1580303c7e8f1b618ae8e5bb45e3834c1eff26a6" Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.226608 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.234952 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b5476f5d5-m5t2w"] Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.248264 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" path="/var/lib/kubelet/pods/f67fc15a-ea85-41f4-b0da-f18e073a0932/volumes" Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.369012 5127 scope.go:117] "RemoveContainer" containerID="67dacc74ccd6e500fc3df11ce5fbfa3b38ea8c346923f439ba5cf2a0840b3f97" Feb 01 08:50:38 crc kubenswrapper[5127]: I0201 08:50:38.453543 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:50:40 crc kubenswrapper[5127]: I0201 08:50:40.186736 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:50:40 crc kubenswrapper[5127]: I0201 08:50:40.334070 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:50:40 crc kubenswrapper[5127]: I0201 08:50:40.334318 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fb665c9d5-9624z" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon-log" containerID="cri-o://639357ac8ca8cec96b97ad463f851183e5f62d8907807035bc870d32046a808e" gracePeriod=30 Feb 01 08:50:40 crc kubenswrapper[5127]: I0201 08:50:40.334919 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-fb665c9d5-9624z" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" containerID="cri-o://73b4fa46f1cde45a75051cc7ded4b36b68c3395d8f12900fd0f2a4213125cf63" gracePeriod=30 Feb 01 08:50:44 crc kubenswrapper[5127]: I0201 08:50:44.290329 5127 generic.go:334] "Generic (PLEG): container finished" podID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerID="73b4fa46f1cde45a75051cc7ded4b36b68c3395d8f12900fd0f2a4213125cf63" exitCode=0 Feb 01 08:50:44 crc kubenswrapper[5127]: I0201 08:50:44.290431 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerDied","Data":"73b4fa46f1cde45a75051cc7ded4b36b68c3395d8f12900fd0f2a4213125cf63"} Feb 01 08:50:45 crc kubenswrapper[5127]: I0201 08:50:45.712277 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fb665c9d5-9624z" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.103:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.103:8080: connect: connection refused" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.782191 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-94fcc5cfc-xwtgm"] Feb 01 08:50:47 crc kubenswrapper[5127]: E0201 08:50:47.782797 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon-log" Feb 01 08:50:47 
crc kubenswrapper[5127]: I0201 08:50:47.782809 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon-log" Feb 01 08:50:47 crc kubenswrapper[5127]: E0201 08:50:47.782838 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.782843 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.783018 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon-log" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.783029 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f67fc15a-ea85-41f4-b0da-f18e073a0932" containerName="horizon" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.783945 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.798962 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-94fcc5cfc-xwtgm"] Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.946311 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabbf971-66fb-461f-8c55-4531caf0d644-logs\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.946393 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dabbf971-66fb-461f-8c55-4531caf0d644-horizon-secret-key\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.946560 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-config-data\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.946704 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4stn\" (UniqueName: \"kubernetes.io/projected/dabbf971-66fb-461f-8c55-4531caf0d644-kube-api-access-x4stn\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:47 crc kubenswrapper[5127]: I0201 08:50:47.946796 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-scripts\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.048620 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabbf971-66fb-461f-8c55-4531caf0d644-logs\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: 
\"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.048714 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dabbf971-66fb-461f-8c55-4531caf0d644-horizon-secret-key\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.048778 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-config-data\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.048860 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4stn\" (UniqueName: \"kubernetes.io/projected/dabbf971-66fb-461f-8c55-4531caf0d644-kube-api-access-x4stn\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.049228 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabbf971-66fb-461f-8c55-4531caf0d644-logs\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.049267 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-scripts\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.049833 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-scripts\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.049964 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dabbf971-66fb-461f-8c55-4531caf0d644-config-data\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.056531 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dabbf971-66fb-461f-8c55-4531caf0d644-horizon-secret-key\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.065400 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4stn\" (UniqueName: \"kubernetes.io/projected/dabbf971-66fb-461f-8c55-4531caf0d644-kube-api-access-x4stn\") pod \"horizon-94fcc5cfc-xwtgm\" (UID: \"dabbf971-66fb-461f-8c55-4531caf0d644\") " pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.107119 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:48 crc kubenswrapper[5127]: I0201 08:50:48.376675 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-94fcc5cfc-xwtgm"] Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.357905 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94fcc5cfc-xwtgm" event={"ID":"dabbf971-66fb-461f-8c55-4531caf0d644","Type":"ContainerStarted","Data":"d97d686646a25b271b6e50073275e5c0205bda59a608f55833bbc54510b647e7"} Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.358572 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94fcc5cfc-xwtgm" event={"ID":"dabbf971-66fb-461f-8c55-4531caf0d644","Type":"ContainerStarted","Data":"6764229d8c861d89dbfb9f3f3d6f1305a2f59722679f657abfbb4c348bc7f0f4"} Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.358634 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-94fcc5cfc-xwtgm" event={"ID":"dabbf971-66fb-461f-8c55-4531caf0d644","Type":"ContainerStarted","Data":"f15e799a4b7636c2b5c6f3acd3bd85b00273ce69a003a84b5367146bc8aeb67d"} Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.380335 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-94fcc5cfc-xwtgm" podStartSLOduration=2.380318997 podStartE2EDuration="2.380318997s" podCreationTimestamp="2026-02-01 08:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:50:49.376769782 +0000 UTC m=+7399.862672155" watchObservedRunningTime="2026-02-01 08:50:49.380318997 +0000 UTC m=+7399.866221360" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.473946 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-wmmqn"] Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.475116 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.489899 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wmmqn"] Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.559790 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-2dc4-account-create-update-jgplp"] Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.561191 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.563009 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.568760 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2dc4-account-create-update-jgplp"] Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.585106 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b48nh\" (UniqueName: \"kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh\") pod \"heat-db-create-wmmqn\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.585199 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts\") pod \"heat-db-create-wmmqn\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.686406 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkqb\" (UniqueName: \"kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.686463 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts\") pod \"heat-db-create-wmmqn\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.686496 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.686639 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48nh\" (UniqueName: \"kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh\") pod \"heat-db-create-wmmqn\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.687513 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts\") pod \"heat-db-create-wmmqn\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.720256 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48nh\" (UniqueName: \"kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh\") pod \"heat-db-create-wmmqn\" (UID: 
\"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.788129 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkqb\" (UniqueName: \"kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.788206 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.788930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.800971 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.828336 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkqb\" (UniqueName: \"kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb\") pod \"heat-2dc4-account-create-update-jgplp\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:49 crc kubenswrapper[5127]: I0201 08:50:49.880233 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:50 crc kubenswrapper[5127]: I0201 08:50:50.277231 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wmmqn"] Feb 01 08:50:50 crc kubenswrapper[5127]: W0201 08:50:50.278936 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a1cb163_c7c2_48e7_8b6f_645a3aac9f08.slice/crio-53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab WatchSource:0}: Error finding container 53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab: Status 404 returned error can't find the container with id 53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab Feb 01 08:50:50 crc kubenswrapper[5127]: I0201 08:50:50.369853 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2dc4-account-create-update-jgplp"] Feb 01 08:50:50 crc kubenswrapper[5127]: I0201 08:50:50.370143 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wmmqn" event={"ID":"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08","Type":"ContainerStarted","Data":"53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab"} Feb 01 08:50:50 crc kubenswrapper[5127]: W0201 08:50:50.376268 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaacf6b8_b97a_4b64_af63_8488a4b422e2.slice/crio-c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57 WatchSource:0}: Error finding container c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57: Status 404 returned error can't find the container with id c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57 Feb 01 08:50:51 crc kubenswrapper[5127]: I0201 08:50:51.382368 5127 generic.go:334] "Generic (PLEG): container finished" podID="4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" containerID="9a04efd0776c9fd65d0393c1fc10ccc2009d239be2b2bcf29624744258bd48a5" exitCode=0 Feb 01 08:50:51 crc kubenswrapper[5127]: I0201 08:50:51.382441 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wmmqn" event={"ID":"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08","Type":"ContainerDied","Data":"9a04efd0776c9fd65d0393c1fc10ccc2009d239be2b2bcf29624744258bd48a5"} Feb 01 08:50:51 crc kubenswrapper[5127]: I0201 08:50:51.384865 5127 generic.go:334] "Generic (PLEG): container finished" podID="eaacf6b8-b97a-4b64-af63-8488a4b422e2" containerID="dc7726286e755c8d59d912b097f90e645f5d736ad054468d1a9a4c91a4792592" exitCode=0 Feb 01 08:50:51 crc kubenswrapper[5127]: I0201 08:50:51.384907 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2dc4-account-create-update-jgplp" event={"ID":"eaacf6b8-b97a-4b64-af63-8488a4b422e2","Type":"ContainerDied","Data":"dc7726286e755c8d59d912b097f90e645f5d736ad054468d1a9a4c91a4792592"} Feb 01 08:50:51 crc kubenswrapper[5127]: I0201 08:50:51.384931 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2dc4-account-create-update-jgplp" event={"ID":"eaacf6b8-b97a-4b64-af63-8488a4b422e2","Type":"ContainerStarted","Data":"c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57"} Feb 01 08:50:52 crc kubenswrapper[5127]: I0201 08:50:52.888953 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:52 crc kubenswrapper[5127]: I0201 08:50:52.895569 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.058743 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxkqb\" (UniqueName: \"kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb\") pod \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.058863 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts\") pod \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.058943 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts\") pod \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\" (UID: \"eaacf6b8-b97a-4b64-af63-8488a4b422e2\") " Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.058975 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b48nh\" (UniqueName: \"kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh\") pod \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\" (UID: \"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08\") " Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.060213 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" (UID: "4a1cb163-c7c2-48e7-8b6f-645a3aac9f08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.060211 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eaacf6b8-b97a-4b64-af63-8488a4b422e2" (UID: "eaacf6b8-b97a-4b64-af63-8488a4b422e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.064559 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb" (OuterVolumeSpecName: "kube-api-access-hxkqb") pod "eaacf6b8-b97a-4b64-af63-8488a4b422e2" (UID: "eaacf6b8-b97a-4b64-af63-8488a4b422e2"). InnerVolumeSpecName "kube-api-access-hxkqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.068371 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh" (OuterVolumeSpecName: "kube-api-access-b48nh") pod "4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" (UID: "4a1cb163-c7c2-48e7-8b6f-645a3aac9f08"). InnerVolumeSpecName "kube-api-access-b48nh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.161752 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eaacf6b8-b97a-4b64-af63-8488a4b422e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.161796 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b48nh\" (UniqueName: \"kubernetes.io/projected/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-kube-api-access-b48nh\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.161813 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxkqb\" (UniqueName: \"kubernetes.io/projected/eaacf6b8-b97a-4b64-af63-8488a4b422e2-kube-api-access-hxkqb\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.161826 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.412615 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wmmqn" event={"ID":"4a1cb163-c7c2-48e7-8b6f-645a3aac9f08","Type":"ContainerDied","Data":"53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab"} Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.412658 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a506502fa8e2fcf4a8b291a53be3e398b33aaa562428a4b39f7d365db68dab" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.412655 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wmmqn" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.415006 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2dc4-account-create-update-jgplp" event={"ID":"eaacf6b8-b97a-4b64-af63-8488a4b422e2","Type":"ContainerDied","Data":"c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57"} Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.415175 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b9d2ca39176b49d8072f6347234f64139eab822becbacd11d8ec03a980cf57" Feb 01 08:50:53 crc kubenswrapper[5127]: I0201 08:50:53.415078 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2dc4-account-create-update-jgplp" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.780888 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-649mx"] Feb 01 08:50:54 crc kubenswrapper[5127]: E0201 08:50:54.781735 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" containerName="mariadb-database-create" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.781755 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" containerName="mariadb-database-create" Feb 01 08:50:54 crc kubenswrapper[5127]: E0201 08:50:54.781786 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaacf6b8-b97a-4b64-af63-8488a4b422e2" containerName="mariadb-account-create-update" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.781801 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaacf6b8-b97a-4b64-af63-8488a4b422e2" containerName="mariadb-account-create-update" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.782125 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" containerName="mariadb-database-create" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.782166 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaacf6b8-b97a-4b64-af63-8488a4b422e2" containerName="mariadb-account-create-update" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.783119 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-649mx" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.786123 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-c7cnn" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.786738 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.805544 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-649mx"] Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.901730 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.901883 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:54 crc kubenswrapper[5127]: I0201 08:50:54.901976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9f4x\" (UniqueName: \"kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.004061 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.004197 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.004282 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9f4x\" (UniqueName: \"kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.011591 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.022567 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.024303 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9f4x\" (UniqueName: \"kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x\") pod \"heat-db-sync-649mx\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.150666 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-649mx" Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.625878 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-649mx"] Feb 01 08:50:55 crc kubenswrapper[5127]: I0201 08:50:55.712020 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fb665c9d5-9624z" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.103:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.103:8080: connect: connection refused" Feb 01 08:50:56 crc kubenswrapper[5127]: I0201 08:50:56.452064 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-649mx" event={"ID":"d893f5d8-79e7-4f3f-b16f-779eec683eda","Type":"ContainerStarted","Data":"cff824bc1c8465daf1f61e0f4543c63d6940bfb17af79627a072d439a63423ce"} Feb 01 08:50:58 crc kubenswrapper[5127]: I0201 08:50:58.108722 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:58 crc kubenswrapper[5127]: I0201 08:50:58.109240 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:50:58 crc kubenswrapper[5127]: I0201 08:50:58.110177 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-94fcc5cfc-xwtgm" podUID="dabbf971-66fb-461f-8c55-4531caf0d644" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.108:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.108:8080: connect: connection refused" Feb 01 08:51:04 crc kubenswrapper[5127]: I0201 08:51:04.533127 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-649mx" event={"ID":"d893f5d8-79e7-4f3f-b16f-779eec683eda","Type":"ContainerStarted","Data":"03ef4e34b84e5f0ba021148c2642d16c20dd3c5825c1a1bcb895e91ea0d3671f"} Feb 01 08:51:05 crc kubenswrapper[5127]: I0201 08:51:05.712242 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-fb665c9d5-9624z" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.103:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.103:8080: connect: connection refused" Feb 01 08:51:05 crc kubenswrapper[5127]: I0201 08:51:05.712516 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:51:05 crc kubenswrapper[5127]: I0201 08:51:05.756751 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-649mx" podStartSLOduration=3.549025739 podStartE2EDuration="11.756727168s" podCreationTimestamp="2026-02-01 08:50:54 +0000 UTC" firstStartedPulling="2026-02-01 08:50:55.641652585 +0000 UTC m=+7406.127554948" lastFinishedPulling="2026-02-01 08:51:03.849353984 +0000 UTC m=+7414.335256377" observedRunningTime="2026-02-01 08:51:04.55153724 +0000 UTC m=+7415.037439603" watchObservedRunningTime="2026-02-01 08:51:05.756727168 +0000 UTC m=+7416.242629541" Feb 01 08:51:06 crc kubenswrapper[5127]: I0201 08:51:06.560843 5127 generic.go:334] "Generic (PLEG): container finished" podID="d893f5d8-79e7-4f3f-b16f-779eec683eda" containerID="03ef4e34b84e5f0ba021148c2642d16c20dd3c5825c1a1bcb895e91ea0d3671f" exitCode=0 Feb 01 08:51:06 crc kubenswrapper[5127]: I0201 08:51:06.561125 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-649mx" 
event={"ID":"d893f5d8-79e7-4f3f-b16f-779eec683eda","Type":"ContainerDied","Data":"03ef4e34b84e5f0ba021148c2642d16c20dd3c5825c1a1bcb895e91ea0d3671f"} Feb 01 08:51:06 crc kubenswrapper[5127]: I0201 08:51:06.740911 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:51:06 crc kubenswrapper[5127]: I0201 08:51:06.741004 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.025629 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-649mx" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.173132 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9f4x\" (UniqueName: \"kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x\") pod \"d893f5d8-79e7-4f3f-b16f-779eec683eda\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.173483 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data\") pod \"d893f5d8-79e7-4f3f-b16f-779eec683eda\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.173666 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle\") pod \"d893f5d8-79e7-4f3f-b16f-779eec683eda\" (UID: \"d893f5d8-79e7-4f3f-b16f-779eec683eda\") " Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.182031 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x" (OuterVolumeSpecName: "kube-api-access-h9f4x") pod "d893f5d8-79e7-4f3f-b16f-779eec683eda" (UID: "d893f5d8-79e7-4f3f-b16f-779eec683eda"). InnerVolumeSpecName "kube-api-access-h9f4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.222494 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d893f5d8-79e7-4f3f-b16f-779eec683eda" (UID: "d893f5d8-79e7-4f3f-b16f-779eec683eda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.252294 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data" (OuterVolumeSpecName: "config-data") pod "d893f5d8-79e7-4f3f-b16f-779eec683eda" (UID: "d893f5d8-79e7-4f3f-b16f-779eec683eda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.275800 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9f4x\" (UniqueName: \"kubernetes.io/projected/d893f5d8-79e7-4f3f-b16f-779eec683eda-kube-api-access-h9f4x\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.275845 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.275863 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d893f5d8-79e7-4f3f-b16f-779eec683eda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.591190 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-649mx" event={"ID":"d893f5d8-79e7-4f3f-b16f-779eec683eda","Type":"ContainerDied","Data":"cff824bc1c8465daf1f61e0f4543c63d6940bfb17af79627a072d439a63423ce"} Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.591267 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff824bc1c8465daf1f61e0f4543c63d6940bfb17af79627a072d439a63423ce" Feb 01 08:51:08 crc kubenswrapper[5127]: I0201 08:51:08.591360 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-649mx" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.683955 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-666985764f-pkz2m"] Feb 01 08:51:09 crc kubenswrapper[5127]: E0201 08:51:09.684754 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d893f5d8-79e7-4f3f-b16f-779eec683eda" containerName="heat-db-sync" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.684767 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d893f5d8-79e7-4f3f-b16f-779eec683eda" containerName="heat-db-sync" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.687365 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d893f5d8-79e7-4f3f-b16f-779eec683eda" containerName="heat-db-sync" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.687998 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.695388 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.695602 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.695828 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-c7cnn" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.703001 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-666985764f-pkz2m"] Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.811103 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-79964bdd45-v7lmb"] Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.819971 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.821771 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.822050 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-combined-ca-bundle\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.822136 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data-custom\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.822195 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s69n8\" (UniqueName: \"kubernetes.io/projected/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-kube-api-access-s69n8\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.823947 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.849360 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79964bdd45-v7lmb"] Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.878637 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6bf65bd6f7-jsgn5"] Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.879951 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.893058 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.907689 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6bf65bd6f7-jsgn5"] Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924082 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-combined-ca-bundle\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924193 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data-custom\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s69n8\" (UniqueName: \"kubernetes.io/projected/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-kube-api-access-s69n8\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924342 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924396 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data-custom\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924525 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924554 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tljzh\" (UniqueName: \"kubernetes.io/projected/7b587dac-ef38-4834-ae3e-16b2cde5219a-kube-api-access-tljzh\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.924604 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-combined-ca-bundle\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " 
pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.928567 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-combined-ca-bundle\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.931922 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data-custom\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.940684 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-config-data\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:09 crc kubenswrapper[5127]: I0201 08:51:09.949632 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s69n8\" (UniqueName: \"kubernetes.io/projected/0a8e11d4-1e74-4618-a7a2-88e646e3d80d-kube-api-access-s69n8\") pod \"heat-engine-666985764f-pkz2m\" (UID: \"0a8e11d4-1e74-4618-a7a2-88e646e3d80d\") " pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.002516 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.022476 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.033411 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-combined-ca-bundle\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.033516 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.033548 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tljzh\" (UniqueName: \"kubernetes.io/projected/7b587dac-ef38-4834-ae3e-16b2cde5219a-kube-api-access-tljzh\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.033604 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-combined-ca-bundle\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.033842 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data-custom\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.034025 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.034060 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data-custom\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.034084 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22g4f\" (UniqueName: \"kubernetes.io/projected/09e2ac26-a2d8-42f4-b58b-33adb0156755-kube-api-access-22g4f\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.048574 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-combined-ca-bundle\") pod \"heat-api-79964bdd45-v7lmb\" 
(UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.061887 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tljzh\" (UniqueName: \"kubernetes.io/projected/7b587dac-ef38-4834-ae3e-16b2cde5219a-kube-api-access-tljzh\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.061937 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.064733 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b587dac-ef38-4834-ae3e-16b2cde5219a-config-data-custom\") pod \"heat-api-79964bdd45-v7lmb\" (UID: \"7b587dac-ef38-4834-ae3e-16b2cde5219a\") " pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.136699 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data-custom\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.136773 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.136796 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22g4f\" (UniqueName: \"kubernetes.io/projected/09e2ac26-a2d8-42f4-b58b-33adb0156755-kube-api-access-22g4f\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.136855 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-combined-ca-bundle\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.145944 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-combined-ca-bundle\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.148754 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data-custom\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " 
pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.160003 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e2ac26-a2d8-42f4-b58b-33adb0156755-config-data\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.163681 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22g4f\" (UniqueName: \"kubernetes.io/projected/09e2ac26-a2d8-42f4-b58b-33adb0156755-kube-api-access-22g4f\") pod \"heat-cfnapi-6bf65bd6f7-jsgn5\" (UID: \"09e2ac26-a2d8-42f4-b58b-33adb0156755\") " pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.202346 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.213998 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.520324 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-666985764f-pkz2m"] Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.625361 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-666985764f-pkz2m" event={"ID":"0a8e11d4-1e74-4618-a7a2-88e646e3d80d","Type":"ContainerStarted","Data":"3385f6faedae695e98379d2963f0c6429bdec198f3bec981524857629c52ca34"} Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.643095 5127 generic.go:334] "Generic (PLEG): container finished" podID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerID="639357ac8ca8cec96b97ad463f851183e5f62d8907807035bc870d32046a808e" exitCode=137 Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.643146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerDied","Data":"639357ac8ca8cec96b97ad463f851183e5f62d8907807035bc870d32046a808e"} Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.728016 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-79964bdd45-v7lmb"] Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.781375 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.861997 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6bf65bd6f7-jsgn5"] Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.960695 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts\") pod \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.960910 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data\") pod \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.960976 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs\") pod \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.961025 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqsdf\" (UniqueName: \"kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf\") pod \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.961059 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key\") pod \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\" (UID: \"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9\") " Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.961566 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs" (OuterVolumeSpecName: "logs") pod "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" (UID: "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.962539 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.965510 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" (UID: "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.965810 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf" (OuterVolumeSpecName: "kube-api-access-dqsdf") pod "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" (UID: "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9"). InnerVolumeSpecName "kube-api-access-dqsdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:51:10 crc kubenswrapper[5127]: I0201 08:51:10.990416 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data" (OuterVolumeSpecName: "config-data") pod "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" (UID: "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.004467 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts" (OuterVolumeSpecName: "scripts") pod "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" (UID: "5afbf678-9f8e-4c9c-b87e-8776fdc77ac9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.063933 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.063966 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqsdf\" (UniqueName: \"kubernetes.io/projected/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-kube-api-access-dqsdf\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.063977 5127 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.063988 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.659224 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-666985764f-pkz2m" event={"ID":"0a8e11d4-1e74-4618-a7a2-88e646e3d80d","Type":"ContainerStarted","Data":"7af02d36ad430476a29f2ae565190f9959ea3eeacd8a3e0cfc6dad6211c71eb0"} Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.659620 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.666445 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79964bdd45-v7lmb" event={"ID":"7b587dac-ef38-4834-ae3e-16b2cde5219a","Type":"ContainerStarted","Data":"f53daf6b8e0c9801fa6c4ef15fd19b7eb5567c85370366021128237cbb41182f"} Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.674088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" event={"ID":"09e2ac26-a2d8-42f4-b58b-33adb0156755","Type":"ContainerStarted","Data":"17228832e2ea850286e2d72613ce16794de807e9216e331c416a7c30e617ecdc"} Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.686373 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-fb665c9d5-9624z" event={"ID":"5afbf678-9f8e-4c9c-b87e-8776fdc77ac9","Type":"ContainerDied","Data":"f454ae75ca12d2d922eab6b50dbb02122715f639cbffda7bcffcb36df750921c"} Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.686424 5127 scope.go:117] "RemoveContainer" 
containerID="73b4fa46f1cde45a75051cc7ded4b36b68c3395d8f12900fd0f2a4213125cf63" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.686555 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-fb665c9d5-9624z" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.702206 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-666985764f-pkz2m" podStartSLOduration=2.702187679 podStartE2EDuration="2.702187679s" podCreationTimestamp="2026-02-01 08:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:51:11.680914258 +0000 UTC m=+7422.166816631" watchObservedRunningTime="2026-02-01 08:51:11.702187679 +0000 UTC m=+7422.188090042" Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.742331 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:51:11 crc kubenswrapper[5127]: I0201 08:51:11.753182 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-fb665c9d5-9624z"] Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.186786 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-94fcc5cfc-xwtgm" Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.250988 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" path="/var/lib/kubelet/pods/5afbf678-9f8e-4c9c-b87e-8776fdc77ac9/volumes" Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.251679 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.251894 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon-log" containerID="cri-o://db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746" gracePeriod=30 Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.252231 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" containerID="cri-o://71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415" gracePeriod=30 Feb 01 08:51:12 crc kubenswrapper[5127]: I0201 08:51:12.401104 5127 scope.go:117] "RemoveContainer" containerID="639357ac8ca8cec96b97ad463f851183e5f62d8907807035bc870d32046a808e" Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.722954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-79964bdd45-v7lmb" event={"ID":"7b587dac-ef38-4834-ae3e-16b2cde5219a","Type":"ContainerStarted","Data":"8ecf4b58d8498c103652adc7de5b2c7167007136ae787e25055e40de14e792f2"} Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.723235 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.726107 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" event={"ID":"09e2ac26-a2d8-42f4-b58b-33adb0156755","Type":"ContainerStarted","Data":"19a6a63d1fe9032a1b16eedfa45cb1d27450483f73c6a009360624833e80664e"} Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.726390 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.750326 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-79964bdd45-v7lmb" podStartSLOduration=2.995092928 podStartE2EDuration="4.750304964s" podCreationTimestamp="2026-02-01 08:51:09 +0000 UTC" firstStartedPulling="2026-02-01 08:51:10.744178511 +0000 UTC m=+7421.230080874" lastFinishedPulling="2026-02-01 08:51:12.499390547 +0000 UTC m=+7422.985292910" observedRunningTime="2026-02-01 08:51:13.743615095 +0000 UTC m=+7424.229517478" watchObservedRunningTime="2026-02-01 08:51:13.750304964 +0000 UTC m=+7424.236207337" Feb 01 08:51:13 crc kubenswrapper[5127]: I0201 08:51:13.778276 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" podStartSLOduration=3.140433633 podStartE2EDuration="4.778254445s" podCreationTimestamp="2026-02-01 08:51:09 +0000 UTC" firstStartedPulling="2026-02-01 08:51:10.863167118 +0000 UTC m=+7421.349069481" lastFinishedPulling="2026-02-01 08:51:12.50098793 +0000 UTC m=+7422.986890293" observedRunningTime="2026-02-01 08:51:13.769103039 +0000 UTC m=+7424.255005412" watchObservedRunningTime="2026-02-01 08:51:13.778254445 +0000 UTC m=+7424.264156818" Feb 01 08:51:16 crc kubenswrapper[5127]: I0201 08:51:16.647822 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Feb 01 08:51:16 crc kubenswrapper[5127]: I0201 08:51:16.751970 5127 generic.go:334] "Generic (PLEG): container finished" podID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerID="71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415" exitCode=0 Feb 01 08:51:16 crc kubenswrapper[5127]: I0201 08:51:16.752012 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerDied","Data":"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415"} Feb 01 08:51:20 crc kubenswrapper[5127]: I0201 08:51:20.062151 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-666985764f-pkz2m" Feb 01 08:51:21 crc kubenswrapper[5127]: I0201 08:51:21.967073 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6bf65bd6f7-jsgn5" Feb 01 08:51:21 crc kubenswrapper[5127]: I0201 08:51:21.997857 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-79964bdd45-v7lmb" Feb 01 08:51:26 crc kubenswrapper[5127]: I0201 08:51:26.648470 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Feb 01 08:51:33 crc kubenswrapper[5127]: I0201 08:51:33.065685 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-w7x6q"] Feb 01 08:51:33 crc kubenswrapper[5127]: I0201 08:51:33.075177 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-60e9-account-create-update-jjsf4"] Feb 01 08:51:33 crc 
kubenswrapper[5127]: I0201 08:51:33.086886 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-w7x6q"] Feb 01 08:51:33 crc kubenswrapper[5127]: I0201 08:51:33.098927 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-60e9-account-create-update-jjsf4"] Feb 01 08:51:34 crc kubenswrapper[5127]: I0201 08:51:34.256571 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9289cd0d-84e3-4e64-b58b-abb7cd491d78" path="/var/lib/kubelet/pods/9289cd0d-84e3-4e64-b58b-abb7cd491d78/volumes" Feb 01 08:51:34 crc kubenswrapper[5127]: I0201 08:51:34.258139 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2e5f2be-7990-4dc6-920e-3dcdf7247424" path="/var/lib/kubelet/pods/f2e5f2be-7990-4dc6-920e-3dcdf7247424/volumes" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.649312 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-568b46bf6c-22znp" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.105:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.105:8080: connect: connection refused" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.649912 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.741148 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.741247 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.741311 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.742281 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 08:51:36 crc kubenswrapper[5127]: I0201 08:51:36.742392 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" gracePeriod=600 Feb 01 08:51:36 crc kubenswrapper[5127]: E0201 08:51:36.873764 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:51:37 crc kubenswrapper[5127]: I0201 08:51:37.003314 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" exitCode=0 Feb 01 08:51:37 crc kubenswrapper[5127]: I0201 08:51:37.003383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf"} Feb 01 08:51:37 crc kubenswrapper[5127]: I0201 08:51:37.003483 5127 scope.go:117] "RemoveContainer" containerID="b64404fcebafdcf797cccad51367be97e4707a6d126b6520c7afde58af417411" Feb 01 08:51:37 crc kubenswrapper[5127]: I0201 08:51:37.004629 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:51:37 crc kubenswrapper[5127]: E0201 08:51:37.005142 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.919502 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd"] Feb 01 08:51:39 crc kubenswrapper[5127]: E0201 08:51:39.921983 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon-log" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.922155 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon-log" Feb 01 08:51:39 crc kubenswrapper[5127]: E0201 08:51:39.922293 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.922389 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.922849 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.923003 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5afbf678-9f8e-4c9c-b87e-8776fdc77ac9" containerName="horizon-log" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.924843 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.930485 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd"] Feb 01 08:51:39 crc kubenswrapper[5127]: I0201 08:51:39.933651 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.046811 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2d6p\" (UniqueName: \"kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.046964 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.047054 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.148795 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.148903 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2d6p\" (UniqueName: \"kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.149032 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.149513 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.149784 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.183525 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2d6p\" (UniqueName: \"kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.266035 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:40 crc kubenswrapper[5127]: I0201 08:51:40.740388 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd"] Feb 01 08:51:41 crc kubenswrapper[5127]: I0201 08:51:41.064022 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerStarted","Data":"8f5f3e92d2b918bf13d06cc3069ebaeecd91cd5ca3cc64ff327c07118aadfab6"} Feb 01 08:51:41 crc kubenswrapper[5127]: I0201 08:51:41.064523 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerStarted","Data":"abf015943ded2a0ba1b672bb28cf3856820e02329fa0f595acb02b74f6842110"} Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.072869 5127 generic.go:334] "Generic (PLEG): container finished" podID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerID="8f5f3e92d2b918bf13d06cc3069ebaeecd91cd5ca3cc64ff327c07118aadfab6" exitCode=0 Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.072956 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerDied","Data":"8f5f3e92d2b918bf13d06cc3069ebaeecd91cd5ca3cc64ff327c07118aadfab6"} Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.218830 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.221774 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.266640 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.311190 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.311428 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdxzn\" (UniqueName: \"kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.311627 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.413424 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdxzn\" (UniqueName: \"kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.413551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.413706 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.414568 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.414683 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.438773 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hdxzn\" (UniqueName: \"kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn\") pod \"redhat-operators-bdsbn\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.549896 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.781797 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.931506 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data\") pod \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.931596 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key\") pod \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.931643 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwbhn\" (UniqueName: \"kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn\") pod \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.931701 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs\") pod \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.931729 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts\") pod \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\" (UID: \"96e9bbef-b801-4f7e-bc75-57f80f6c06c9\") " Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.941204 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs" (OuterVolumeSpecName: "logs") pod "96e9bbef-b801-4f7e-bc75-57f80f6c06c9" (UID: "96e9bbef-b801-4f7e-bc75-57f80f6c06c9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.970742 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "96e9bbef-b801-4f7e-bc75-57f80f6c06c9" (UID: "96e9bbef-b801-4f7e-bc75-57f80f6c06c9"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.978415 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data" (OuterVolumeSpecName: "config-data") pod "96e9bbef-b801-4f7e-bc75-57f80f6c06c9" (UID: "96e9bbef-b801-4f7e-bc75-57f80f6c06c9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:51:42 crc kubenswrapper[5127]: I0201 08:51:42.979876 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn" (OuterVolumeSpecName: "kube-api-access-nwbhn") pod "96e9bbef-b801-4f7e-bc75-57f80f6c06c9" (UID: "96e9bbef-b801-4f7e-bc75-57f80f6c06c9"). InnerVolumeSpecName "kube-api-access-nwbhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.023193 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts" (OuterVolumeSpecName: "scripts") pod "96e9bbef-b801-4f7e-bc75-57f80f6c06c9" (UID: "96e9bbef-b801-4f7e-bc75-57f80f6c06c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.036657 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.036692 5127 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.036708 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwbhn\" (UniqueName: \"kubernetes.io/projected/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-kube-api-access-nwbhn\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.036722 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-logs\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.036735 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96e9bbef-b801-4f7e-bc75-57f80f6c06c9-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.096186 5127 generic.go:334] "Generic (PLEG): container finished" podID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerID="db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746" exitCode=137 Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.096237 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerDied","Data":"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746"} Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.096271 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-568b46bf6c-22znp" 
event={"ID":"96e9bbef-b801-4f7e-bc75-57f80f6c06c9","Type":"ContainerDied","Data":"57abd8d63d1daa719ca98e1deb7da35a1c8851263517bf0bf2e252431c8b3717"} Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.096290 5127 scope.go:117] "RemoveContainer" containerID="71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.096340 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-568b46bf6c-22znp" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.144128 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.153539 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-568b46bf6c-22znp"] Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.207002 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.311861 5127 scope.go:117] "RemoveContainer" containerID="db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746" Feb 01 08:51:43 crc kubenswrapper[5127]: W0201 08:51:43.317336 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76dc6ff2_adb5_4c24_8f5e_433040a4d4b0.slice/crio-614fdaec4f9d0c93733849017decfc2a6b7882b8c5ec9d7117dcde03ff559125 WatchSource:0}: Error finding container 614fdaec4f9d0c93733849017decfc2a6b7882b8c5ec9d7117dcde03ff559125: Status 404 returned error can't find the container with id 614fdaec4f9d0c93733849017decfc2a6b7882b8c5ec9d7117dcde03ff559125 Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.490772 5127 scope.go:117] "RemoveContainer" containerID="71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415" Feb 01 08:51:43 crc kubenswrapper[5127]: E0201 08:51:43.491229 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415\": container with ID starting with 71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415 not found: ID does not exist" containerID="71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.491257 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415"} err="failed to get container status \"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415\": rpc error: code = NotFound desc = could not find container \"71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415\": container with ID starting with 71d88e971a73f1f83dd8b261a49cbe6790ef436986632bc7b7c6eabf2ff1f415 not found: ID does not exist" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.491278 5127 scope.go:117] "RemoveContainer" containerID="db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746" Feb 01 08:51:43 crc kubenswrapper[5127]: E0201 08:51:43.491765 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746\": container with ID starting with db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746 not found: ID does not exist" 
containerID="db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746" Feb 01 08:51:43 crc kubenswrapper[5127]: I0201 08:51:43.491788 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746"} err="failed to get container status \"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746\": rpc error: code = NotFound desc = could not find container \"db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746\": container with ID starting with db302b0e13637e4e8ff5488bba49be4b1a5de0ffccfcbffe2ff74584bd443746 not found: ID does not exist" Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.106900 5127 generic.go:334] "Generic (PLEG): container finished" podID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerID="0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284" exitCode=0 Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.107052 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerDied","Data":"0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284"} Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.109731 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerStarted","Data":"614fdaec4f9d0c93733849017decfc2a6b7882b8c5ec9d7117dcde03ff559125"} Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.117519 5127 generic.go:334] "Generic (PLEG): container finished" podID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerID="99cb8fc6f3eb25b79c283d276c867cd9dcb230772e0d4c85035f6784bdd40bc0" exitCode=0 Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.117565 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerDied","Data":"99cb8fc6f3eb25b79c283d276c867cd9dcb230772e0d4c85035f6784bdd40bc0"} Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.245894 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" path="/var/lib/kubelet/pods/96e9bbef-b801-4f7e-bc75-57f80f6c06c9/volumes" Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.280600 5127 scope.go:117] "RemoveContainer" containerID="d4a2e6e80a65fef990ee04fb2cccb3997b4713bf2233a615d3dfdc78de7479c9" Feb 01 08:51:44 crc kubenswrapper[5127]: I0201 08:51:44.307450 5127 scope.go:117] "RemoveContainer" containerID="88a134c2b0427d1c9fae08dff4980472d8a5a5771e178bb51d03b8ca606c2efd" Feb 01 08:51:45 crc kubenswrapper[5127]: I0201 08:51:45.144254 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerStarted","Data":"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33"} Feb 01 08:51:45 crc kubenswrapper[5127]: I0201 08:51:45.158388 5127 generic.go:334] "Generic (PLEG): container finished" podID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerID="fb0053b9adea34227f7507dc48a102d34e77bc9c3c581697f50c3b2f9d9d4544" exitCode=0 Feb 01 08:51:45 crc kubenswrapper[5127]: I0201 08:51:45.158463 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerDied","Data":"fb0053b9adea34227f7507dc48a102d34e77bc9c3c581697f50c3b2f9d9d4544"} Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.068930 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xvm96"] Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.081914 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xvm96"] Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.248508 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d99c6eb-df9a-4205-9674-695a49e6c720" path="/var/lib/kubelet/pods/8d99c6eb-df9a-4205-9674-695a49e6c720/volumes" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.706852 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.811492 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle\") pod \"e9aacae9-b0d2-4661-8e56-52e562125b03\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.811755 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util\") pod \"e9aacae9-b0d2-4661-8e56-52e562125b03\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.811888 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2d6p\" (UniqueName: \"kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p\") pod \"e9aacae9-b0d2-4661-8e56-52e562125b03\" (UID: \"e9aacae9-b0d2-4661-8e56-52e562125b03\") " Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.813331 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle" (OuterVolumeSpecName: "bundle") pod "e9aacae9-b0d2-4661-8e56-52e562125b03" (UID: "e9aacae9-b0d2-4661-8e56-52e562125b03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.818419 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p" (OuterVolumeSpecName: "kube-api-access-n2d6p") pod "e9aacae9-b0d2-4661-8e56-52e562125b03" (UID: "e9aacae9-b0d2-4661-8e56-52e562125b03"). InnerVolumeSpecName "kube-api-access-n2d6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.822560 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util" (OuterVolumeSpecName: "util") pod "e9aacae9-b0d2-4661-8e56-52e562125b03" (UID: "e9aacae9-b0d2-4661-8e56-52e562125b03"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.915751 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2d6p\" (UniqueName: \"kubernetes.io/projected/e9aacae9-b0d2-4661-8e56-52e562125b03-kube-api-access-n2d6p\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.915836 5127 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:46 crc kubenswrapper[5127]: I0201 08:51:46.915855 5127 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e9aacae9-b0d2-4661-8e56-52e562125b03-util\") on node \"crc\" DevicePath \"\"" Feb 01 08:51:47 crc kubenswrapper[5127]: I0201 08:51:47.181503 5127 generic.go:334] "Generic (PLEG): container finished" podID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerID="ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33" exitCode=0 Feb 01 08:51:47 crc kubenswrapper[5127]: I0201 08:51:47.181575 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerDied","Data":"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33"} Feb 01 08:51:47 crc kubenswrapper[5127]: I0201 08:51:47.185986 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" event={"ID":"e9aacae9-b0d2-4661-8e56-52e562125b03","Type":"ContainerDied","Data":"abf015943ded2a0ba1b672bb28cf3856820e02329fa0f595acb02b74f6842110"} Feb 01 08:51:47 crc kubenswrapper[5127]: I0201 08:51:47.186065 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf015943ded2a0ba1b672bb28cf3856820e02329fa0f595acb02b74f6842110" Feb 01 08:51:47 crc kubenswrapper[5127]: I0201 08:51:47.186116 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd" Feb 01 08:51:48 crc kubenswrapper[5127]: I0201 08:51:48.199667 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerStarted","Data":"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b"} Feb 01 08:51:48 crc kubenswrapper[5127]: I0201 08:51:48.232465 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bdsbn" podStartSLOduration=2.641703287 podStartE2EDuration="6.232445187s" podCreationTimestamp="2026-02-01 08:51:42 +0000 UTC" firstStartedPulling="2026-02-01 08:51:44.109293063 +0000 UTC m=+7454.595195426" lastFinishedPulling="2026-02-01 08:51:47.700034953 +0000 UTC m=+7458.185937326" observedRunningTime="2026-02-01 08:51:48.220546768 +0000 UTC m=+7458.706449131" watchObservedRunningTime="2026-02-01 08:51:48.232445187 +0000 UTC m=+7458.718347550" Feb 01 08:51:51 crc kubenswrapper[5127]: I0201 08:51:51.236296 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:51:51 crc kubenswrapper[5127]: E0201 08:51:51.237423 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:51:52 crc kubenswrapper[5127]: I0201 08:51:52.551132 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:52 crc kubenswrapper[5127]: I0201 08:51:52.551189 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:51:53 crc kubenswrapper[5127]: I0201 08:51:53.603755 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bdsbn" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" probeResult="failure" output=< Feb 01 08:51:53 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:51:53 crc kubenswrapper[5127]: > Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.805764 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:51:55 crc kubenswrapper[5127]: E0201 08:51:55.806440 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="util" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806457 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="util" Feb 01 08:51:55 crc kubenswrapper[5127]: E0201 08:51:55.806481 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806488 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" Feb 01 08:51:55 crc kubenswrapper[5127]: E0201 08:51:55.806501 5127 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="pull" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806510 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="pull" Feb 01 08:51:55 crc kubenswrapper[5127]: E0201 08:51:55.806533 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="extract" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806538 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="extract" Feb 01 08:51:55 crc kubenswrapper[5127]: E0201 08:51:55.806551 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon-log" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806558 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon-log" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806760 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806782 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e9bbef-b801-4f7e-bc75-57f80f6c06c9" containerName="horizon-log" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.806792 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9aacae9-b0d2-4661-8e56-52e562125b03" containerName="extract" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.808093 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.843892 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.930261 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpl7g\" (UniqueName: \"kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.930612 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:55 crc kubenswrapper[5127]: I0201 08:51:55.930896 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.032329 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content\") pod \"redhat-marketplace-k6m42\" (UID: 
\"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.032419 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpl7g\" (UniqueName: \"kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.032563 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.032824 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.032991 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.063084 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpl7g\" (UniqueName: \"kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g\") pod \"redhat-marketplace-k6m42\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.131245 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:51:56 crc kubenswrapper[5127]: I0201 08:51:56.637229 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.291537 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d42abb7-5366-4a13-94cb-02608047a868" containerID="3ca90c7438c915dda8c508c058b555ed19dbd0d59efbcebe270882641ab9040d" exitCode=0 Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.291627 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerDied","Data":"3ca90c7438c915dda8c508c058b555ed19dbd0d59efbcebe270882641ab9040d"} Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.291837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerStarted","Data":"b15d5579a5993823d2bb8594050a436938de503657caa3696699a298449b6def"} Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.734107 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9"] Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.736671 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.739508 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.739617 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.798172 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-w2qp6" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.798241 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9"] Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.920194 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm"] Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.944709 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.952458 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-c2hbh" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.980826 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcfz\" (UniqueName: \"kubernetes.io/projected/b9cff7c9-e3d3-41cb-8b79-76cca738c2f6-kube-api-access-vfcfz\") pod \"obo-prometheus-operator-68bc856cb9-fl5c9\" (UID: \"b9cff7c9-e3d3-41cb-8b79-76cca738c2f6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.981800 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 01 08:51:57 crc kubenswrapper[5127]: I0201 08:51:57.987971 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.003697 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.007443 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.009820 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.086234 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.086569 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcfz\" (UniqueName: \"kubernetes.io/projected/b9cff7c9-e3d3-41cb-8b79-76cca738c2f6-kube-api-access-vfcfz\") pod \"obo-prometheus-operator-68bc856cb9-fl5c9\" (UID: \"b9cff7c9-e3d3-41cb-8b79-76cca738c2f6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.086805 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.125766 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b4zck"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.132055 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.135150 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-f2tjf" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.135204 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.136650 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcfz\" (UniqueName: \"kubernetes.io/projected/b9cff7c9-e3d3-41cb-8b79-76cca738c2f6-kube-api-access-vfcfz\") pod \"obo-prometheus-operator-68bc856cb9-fl5c9\" (UID: \"b9cff7c9-e3d3-41cb-8b79-76cca738c2f6\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.136906 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b4zck"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201196 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201310 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201378 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201414 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201431 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.201453 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7zwm7\" (UniqueName: \"kubernetes.io/projected/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-kube-api-access-7zwm7\") pod \"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.215301 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.215717 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d9280e1-7d78-466f-a218-29bc52ab31d5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm\" (UID: \"5d9280e1-7d78-466f-a218-29bc52ab31d5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.221352 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4c7dx"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.222974 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.224649 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-srzkk" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.270750 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4c7dx"] Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.308937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.308994 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.309020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.309042 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwm7\" (UniqueName: \"kubernetes.io/projected/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-kube-api-access-7zwm7\") pod 
\"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.315755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.318422 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.321651 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.326198 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/64f4b3f8-dc3b-44c5-ab17-51cec08322b0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl\" (UID: \"64f4b3f8-dc3b-44c5-ab17-51cec08322b0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.334162 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwm7\" (UniqueName: \"kubernetes.io/projected/c15e3a2f-7b85-439d-8fcc-9108c58e7a9e-kube-api-access-7zwm7\") pod \"observability-operator-59bdc8b94-b4zck\" (UID: \"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e\") " pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.373646 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.376782 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.410511 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.410603 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxpgs\" (UniqueName: \"kubernetes.io/projected/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-kube-api-access-jxpgs\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.437159 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.513411 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.513793 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxpgs\" (UniqueName: \"kubernetes.io/projected/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-kube-api-access-jxpgs\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.514827 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-openshift-service-ca\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.532158 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxpgs\" (UniqueName: \"kubernetes.io/projected/7e9485f5-0c8c-40cb-88de-fae715ae2f3f-kube-api-access-jxpgs\") pod \"perses-operator-5bf474d74f-4c7dx\" (UID: \"7e9485f5-0c8c-40cb-88de-fae715ae2f3f\") " pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.691288 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:51:58 crc kubenswrapper[5127]: I0201 08:51:58.937874 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm"] Feb 01 08:51:58 crc kubenswrapper[5127]: W0201 08:51:58.973515 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d9280e1_7d78_466f_a218_29bc52ab31d5.slice/crio-9951153b95a79827f2dedfc96b839f70b93568fce7c80de5e7bfc2aede2e14db WatchSource:0}: Error finding container 9951153b95a79827f2dedfc96b839f70b93568fce7c80de5e7bfc2aede2e14db: Status 404 returned error can't find the container with id 9951153b95a79827f2dedfc96b839f70b93568fce7c80de5e7bfc2aede2e14db Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.064039 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl"] Feb 01 08:51:59 crc kubenswrapper[5127]: W0201 08:51:59.091791 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc15e3a2f_7b85_439d_8fcc_9108c58e7a9e.slice/crio-fee5d19b427f6c5e18377bb1c5875ab2edc7f1fe8c29a761a524ad7b3a548d84 WatchSource:0}: Error finding container fee5d19b427f6c5e18377bb1c5875ab2edc7f1fe8c29a761a524ad7b3a548d84: Status 404 returned error can't find the container with id fee5d19b427f6c5e18377bb1c5875ab2edc7f1fe8c29a761a524ad7b3a548d84 Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.097858 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-b4zck"] Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.205063 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9"] Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.319253 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-4c7dx"] Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.336538 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d42abb7-5366-4a13-94cb-02608047a868" containerID="25df119bda8077352312d99c570fd772561f2c86f91687a55050fab106653bc0" exitCode=0 Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.336904 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerDied","Data":"25df119bda8077352312d99c570fd772561f2c86f91687a55050fab106653bc0"} Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.344717 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" event={"ID":"64f4b3f8-dc3b-44c5-ab17-51cec08322b0","Type":"ContainerStarted","Data":"c9b55f022b0670361c0c4c085f58b76829b54c316e7ca77b8057ad6b40fd0df0"} Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.346271 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" event={"ID":"5d9280e1-7d78-466f-a218-29bc52ab31d5","Type":"ContainerStarted","Data":"9951153b95a79827f2dedfc96b839f70b93568fce7c80de5e7bfc2aede2e14db"} Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.348562 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" event={"ID":"b9cff7c9-e3d3-41cb-8b79-76cca738c2f6","Type":"ContainerStarted","Data":"df3f23d36e57d17485d99be9911af6aea7fe572f768b17ada585f366cfde4c82"} Feb 01 08:51:59 crc kubenswrapper[5127]: I0201 08:51:59.354731 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" event={"ID":"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e","Type":"ContainerStarted","Data":"fee5d19b427f6c5e18377bb1c5875ab2edc7f1fe8c29a761a524ad7b3a548d84"} Feb 01 08:52:00 crc kubenswrapper[5127]: I0201 08:52:00.413803 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerStarted","Data":"438b1c30fe08faa1a3ff2636e4a6afb8911ffa8218ebb9229703942bd0d982b2"} Feb 01 08:52:00 crc kubenswrapper[5127]: I0201 08:52:00.421518 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" event={"ID":"7e9485f5-0c8c-40cb-88de-fae715ae2f3f","Type":"ContainerStarted","Data":"1ebde3b3f4d23f717ffd11a52b7ac897e05da0f5caa3e7de1c066ba993ae2cf2"} Feb 01 08:52:03 crc kubenswrapper[5127]: I0201 08:52:03.608705 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bdsbn" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" probeResult="failure" output=< Feb 01 08:52:03 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:52:03 crc kubenswrapper[5127]: > Feb 01 08:52:05 crc kubenswrapper[5127]: I0201 08:52:05.244402 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:52:05 crc kubenswrapper[5127]: E0201 08:52:05.245007 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:52:06 crc kubenswrapper[5127]: I0201 08:52:06.131948 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:06 crc kubenswrapper[5127]: I0201 08:52:06.132357 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:06 crc kubenswrapper[5127]: I0201 08:52:06.205778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:06 crc kubenswrapper[5127]: I0201 08:52:06.249594 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k6m42" podStartSLOduration=8.788237373 podStartE2EDuration="11.249556538s" podCreationTimestamp="2026-02-01 08:51:55 +0000 UTC" firstStartedPulling="2026-02-01 08:51:57.293600506 +0000 UTC m=+7467.779502869" lastFinishedPulling="2026-02-01 08:51:59.754919671 +0000 UTC m=+7470.240822034" observedRunningTime="2026-02-01 08:52:00.552637123 +0000 UTC m=+7471.038539486" watchObservedRunningTime="2026-02-01 08:52:06.249556538 +0000 UTC m=+7476.735458901" Feb 01 08:52:06 crc kubenswrapper[5127]: I0201 
08:52:06.583914 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:07 crc kubenswrapper[5127]: I0201 08:52:07.805933 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:52:08 crc kubenswrapper[5127]: I0201 08:52:08.504890 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k6m42" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="registry-server" containerID="cri-o://438b1c30fe08faa1a3ff2636e4a6afb8911ffa8218ebb9229703942bd0d982b2" gracePeriod=2 Feb 01 08:52:09 crc kubenswrapper[5127]: I0201 08:52:09.096645 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a9db-account-create-update-6zsjs"] Feb 01 08:52:09 crc kubenswrapper[5127]: I0201 08:52:09.104719 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a9db-account-create-update-6zsjs"] Feb 01 08:52:09 crc kubenswrapper[5127]: I0201 08:52:09.519435 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d42abb7-5366-4a13-94cb-02608047a868" containerID="438b1c30fe08faa1a3ff2636e4a6afb8911ffa8218ebb9229703942bd0d982b2" exitCode=0 Feb 01 08:52:09 crc kubenswrapper[5127]: I0201 08:52:09.519488 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerDied","Data":"438b1c30fe08faa1a3ff2636e4a6afb8911ffa8218ebb9229703942bd0d982b2"} Feb 01 08:52:10 crc kubenswrapper[5127]: I0201 08:52:10.028777 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6rrqz"] Feb 01 08:52:10 crc kubenswrapper[5127]: I0201 08:52:10.039795 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6rrqz"] Feb 01 08:52:10 crc kubenswrapper[5127]: I0201 08:52:10.247300 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc1de06-1d9c-4c95-979d-5c609693a411" path="/var/lib/kubelet/pods/8cc1de06-1d9c-4c95-979d-5c609693a411/volumes" Feb 01 08:52:10 crc kubenswrapper[5127]: I0201 08:52:10.247863 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75d0138-b829-4d7b-add2-c80532012c2d" path="/var/lib/kubelet/pods/c75d0138-b829-4d7b-add2-c80532012c2d/volumes" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.511498 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.541645 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" event={"ID":"64f4b3f8-dc3b-44c5-ab17-51cec08322b0","Type":"ContainerStarted","Data":"ca6de02a6fd2de3b61546d3dc818264eaf7832d6cab3a210c2e63d848a152714"} Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.543929 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k6m42" event={"ID":"4d42abb7-5366-4a13-94cb-02608047a868","Type":"ContainerDied","Data":"b15d5579a5993823d2bb8594050a436938de503657caa3696699a298449b6def"} Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.543979 5127 scope.go:117] "RemoveContainer" containerID="438b1c30fe08faa1a3ff2636e4a6afb8911ffa8218ebb9229703942bd0d982b2" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.544092 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k6m42" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.586735 5127 scope.go:117] "RemoveContainer" containerID="25df119bda8077352312d99c570fd772561f2c86f91687a55050fab106653bc0" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.592087 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl" podStartSLOduration=2.607342047 podStartE2EDuration="14.59206631s" podCreationTimestamp="2026-02-01 08:51:57 +0000 UTC" firstStartedPulling="2026-02-01 08:51:59.083842653 +0000 UTC m=+7469.569745016" lastFinishedPulling="2026-02-01 08:52:11.068566916 +0000 UTC m=+7481.554469279" observedRunningTime="2026-02-01 08:52:11.584210779 +0000 UTC m=+7482.070113142" watchObservedRunningTime="2026-02-01 08:52:11.59206631 +0000 UTC m=+7482.077968673" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.606140 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content\") pod \"4d42abb7-5366-4a13-94cb-02608047a868\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.606221 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpl7g\" (UniqueName: \"kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g\") pod \"4d42abb7-5366-4a13-94cb-02608047a868\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.606381 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities\") pod \"4d42abb7-5366-4a13-94cb-02608047a868\" (UID: \"4d42abb7-5366-4a13-94cb-02608047a868\") " Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.616237 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g" (OuterVolumeSpecName: "kube-api-access-fpl7g") pod "4d42abb7-5366-4a13-94cb-02608047a868" (UID: "4d42abb7-5366-4a13-94cb-02608047a868"). InnerVolumeSpecName "kube-api-access-fpl7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.616517 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities" (OuterVolumeSpecName: "utilities") pod "4d42abb7-5366-4a13-94cb-02608047a868" (UID: "4d42abb7-5366-4a13-94cb-02608047a868"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.657755 5127 scope.go:117] "RemoveContainer" containerID="3ca90c7438c915dda8c508c058b555ed19dbd0d59efbcebe270882641ab9040d" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.674752 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d42abb7-5366-4a13-94cb-02608047a868" (UID: "4d42abb7-5366-4a13-94cb-02608047a868"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.713082 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.713113 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d42abb7-5366-4a13-94cb-02608047a868-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.713124 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpl7g\" (UniqueName: \"kubernetes.io/projected/4d42abb7-5366-4a13-94cb-02608047a868-kube-api-access-fpl7g\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.896660 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:52:11 crc kubenswrapper[5127]: I0201 08:52:11.912135 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k6m42"] Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.278494 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d42abb7-5366-4a13-94cb-02608047a868" path="/var/lib/kubelet/pods/4d42abb7-5366-4a13-94cb-02608047a868/volumes" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.554038 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" event={"ID":"c15e3a2f-7b85-439d-8fcc-9108c58e7a9e","Type":"ContainerStarted","Data":"86602a7d6371d8a282c3883afc208b8505d5cee76c420526e39e6614192269d2"} Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.555524 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.562736 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" event={"ID":"5d9280e1-7d78-466f-a218-29bc52ab31d5","Type":"ContainerStarted","Data":"d3f8e3bc223f9195cc2b42d9948420a8818b7c3c4a7af07519072c75c81562cd"} Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.564868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" event={"ID":"7e9485f5-0c8c-40cb-88de-fae715ae2f3f","Type":"ContainerStarted","Data":"beb2c658b2efe1eff5ea83bfa3d56a496946950098f9a6b13ccb79ca941a2ca3"} Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.565296 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.567151 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" event={"ID":"b9cff7c9-e3d3-41cb-8b79-76cca738c2f6","Type":"ContainerStarted","Data":"2aca1f475fb687d9fa4dca885ad1cdd0421429bc9a3a8a1673314dfd5d6c4368"} Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.581277 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.601705 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-b4zck" podStartSLOduration=2.5890553609999998 podStartE2EDuration="14.601687104s" podCreationTimestamp="2026-02-01 08:51:58 +0000 UTC" firstStartedPulling="2026-02-01 08:51:59.09454523 +0000 UTC m=+7469.580447593" lastFinishedPulling="2026-02-01 08:52:11.107176973 +0000 UTC m=+7481.593079336" observedRunningTime="2026-02-01 08:52:12.581467971 +0000 UTC m=+7483.067370324" watchObservedRunningTime="2026-02-01 08:52:12.601687104 +0000 UTC m=+7483.087589467" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.602327 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" podStartSLOduration=2.85813945 podStartE2EDuration="14.602321711s" podCreationTimestamp="2026-02-01 08:51:58 +0000 UTC" firstStartedPulling="2026-02-01 08:51:59.325521255 +0000 UTC m=+7469.811423618" lastFinishedPulling="2026-02-01 08:52:11.069703516 +0000 UTC m=+7481.555605879" observedRunningTime="2026-02-01 08:52:12.596739461 +0000 UTC m=+7483.082641824" watchObservedRunningTime="2026-02-01 08:52:12.602321711 +0000 UTC m=+7483.088224084" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.620942 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm" podStartSLOduration=3.481393659 podStartE2EDuration="15.620920781s" podCreationTimestamp="2026-02-01 08:51:57 +0000 UTC" firstStartedPulling="2026-02-01 08:51:58.977258829 +0000 UTC m=+7469.463161192" lastFinishedPulling="2026-02-01 08:52:11.116785951 +0000 UTC m=+7481.602688314" observedRunningTime="2026-02-01 08:52:12.617304944 +0000 UTC m=+7483.103207307" watchObservedRunningTime="2026-02-01 08:52:12.620920781 +0000 UTC m=+7483.106823144" Feb 01 08:52:12 crc kubenswrapper[5127]: I0201 08:52:12.684621 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-fl5c9" podStartSLOduration=3.8226562680000002 podStartE2EDuration="15.684600152s" podCreationTimestamp="2026-02-01 08:51:57 +0000 UTC" firstStartedPulling="2026-02-01 08:51:59.218944282 +0000 UTC m=+7469.704846645" lastFinishedPulling="2026-02-01 08:52:11.080888166 +0000 UTC m=+7481.566790529" observedRunningTime="2026-02-01 08:52:12.676497594 +0000 UTC m=+7483.162399957" watchObservedRunningTime="2026-02-01 08:52:12.684600152 +0000 UTC 
m=+7483.170502515" Feb 01 08:52:13 crc kubenswrapper[5127]: I0201 08:52:13.601801 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bdsbn" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" probeResult="failure" output=< Feb 01 08:52:13 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:52:13 crc kubenswrapper[5127]: > Feb 01 08:52:18 crc kubenswrapper[5127]: I0201 08:52:18.074762 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-z5d2n"] Feb 01 08:52:18 crc kubenswrapper[5127]: I0201 08:52:18.091668 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-z5d2n"] Feb 01 08:52:18 crc kubenswrapper[5127]: I0201 08:52:18.235675 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:52:18 crc kubenswrapper[5127]: E0201 08:52:18.236140 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:52:18 crc kubenswrapper[5127]: I0201 08:52:18.245521 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7056c1-d866-4310-abe7-fd62cab68866" path="/var/lib/kubelet/pods/2f7056c1-d866-4310-abe7-fd62cab68866/volumes" Feb 01 08:52:18 crc kubenswrapper[5127]: I0201 08:52:18.693885 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-4c7dx" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.560520 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.561041 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="de6aab82-98b8-4090-bb46-192a713ee9a8" containerName="openstackclient" containerID="cri-o://4674861e140aa628f122b3cb74c8778f7bee29b537e33ed1adbe029b312855fa" gracePeriod=2 Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.571502 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.615825 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 01 08:52:21 crc kubenswrapper[5127]: E0201 08:52:21.616255 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de6aab82-98b8-4090-bb46-192a713ee9a8" containerName="openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616274 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="de6aab82-98b8-4090-bb46-192a713ee9a8" containerName="openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: E0201 08:52:21.616296 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="extract-utilities" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616303 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="extract-utilities" Feb 01 08:52:21 crc kubenswrapper[5127]: E0201 08:52:21.616331 5127 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="registry-server" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616337 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="registry-server" Feb 01 08:52:21 crc kubenswrapper[5127]: E0201 08:52:21.616355 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="extract-content" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616361 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="extract-content" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616541 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="de6aab82-98b8-4090-bb46-192a713ee9a8" containerName="openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.616553 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d42abb7-5366-4a13-94cb-02608047a868" containerName="registry-server" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.617557 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.627232 5127 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="de6aab82-98b8-4090-bb46-192a713ee9a8" podUID="6dc16ed6-1e14-435e-bafa-e151fce8a2bd" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.640874 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.707427 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.707811 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.707848 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zzqm\" (UniqueName: \"kubernetes.io/projected/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-kube-api-access-6zzqm\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.772199 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.773403 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.776413 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rk888" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.786268 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.819728 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n25w\" (UniqueName: \"kubernetes.io/projected/8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25-kube-api-access-4n25w\") pod \"kube-state-metrics-0\" (UID: \"8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25\") " pod="openstack/kube-state-metrics-0" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.819808 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.819842 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.819868 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zzqm\" (UniqueName: \"kubernetes.io/projected/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-kube-api-access-6zzqm\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.820979 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.843231 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-openstack-config-secret\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.874024 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zzqm\" (UniqueName: \"kubernetes.io/projected/6dc16ed6-1e14-435e-bafa-e151fce8a2bd-kube-api-access-6zzqm\") pod \"openstackclient\" (UID: \"6dc16ed6-1e14-435e-bafa-e151fce8a2bd\") " pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.921828 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n25w\" (UniqueName: \"kubernetes.io/projected/8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25-kube-api-access-4n25w\") pod \"kube-state-metrics-0\" (UID: \"8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25\") " pod="openstack/kube-state-metrics-0" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.934171 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 08:52:21 crc kubenswrapper[5127]: I0201 08:52:21.947436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n25w\" (UniqueName: \"kubernetes.io/projected/8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25-kube-api-access-4n25w\") pod \"kube-state-metrics-0\" (UID: \"8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25\") " pod="openstack/kube-state-metrics-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.098177 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.792046 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.827291 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.829111 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.834673 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-fsp4b" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.836300 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.836519 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.836668 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.836840 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869395 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869450 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869502 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869531 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869566 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8lj\" (UniqueName: \"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-kube-api-access-2m8lj\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869637 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.869764 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.969403 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975223 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975341 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975383 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975399 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975433 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975456 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.975478 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8lj\" (UniqueName: \"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-kube-api-access-2m8lj\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.977335 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.982446 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.983211 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.986016 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:22 crc kubenswrapper[5127]: I0201 08:52:22.994625 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.005902 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.064167 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8lj\" (UniqueName: \"kubernetes.io/projected/d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4-kube-api-access-2m8lj\") pod \"alertmanager-metric-storage-0\" (UID: \"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4\") " pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:23 crc 
kubenswrapper[5127]: I0201 08:52:23.113665 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.254953 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.275041 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.284419 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.285089 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.285348 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.285565 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.285803 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.286088 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.292005 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ngvr2" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.292217 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.350347 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.364122 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.403999 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404060 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzzqz\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-kube-api-access-fzzqz\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404103 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " 
pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404356 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404385 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404409 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404447 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404472 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404499 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ee773d63-9213-4566-a575-e86b796fa167-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.404536 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.510775 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511052 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fzzqz\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-kube-api-access-fzzqz\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511086 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511123 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511141 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511158 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511185 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511201 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511219 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ee773d63-9213-4566-a575-e86b796fa167-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.511248 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc 
kubenswrapper[5127]: I0201 08:52:23.515937 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.516338 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.516551 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.517094 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/ee773d63-9213-4566-a575-e86b796fa167-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.520946 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ee773d63-9213-4566-a575-e86b796fa167-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.550088 5127 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.550136 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a6215b2d46a512657ec0e5b2863e4dc274666ccbc8a461170e019840161db95/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.559466 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.562966 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-config\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.565976 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ee773d63-9213-4566-a575-e86b796fa167-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.570165 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzzqz\" (UniqueName: \"kubernetes.io/projected/ee773d63-9213-4566-a575-e86b796fa167-kube-api-access-fzzqz\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.632880 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d8f2ed94-03eb-47c1-b5a6-2ca682e03c20\") pod \"prometheus-metric-storage-0\" (UID: \"ee773d63-9213-4566-a575-e86b796fa167\") " pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.653298 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.660818 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bdsbn" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" probeResult="failure" output=< Feb 01 08:52:23 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 08:52:23 crc kubenswrapper[5127]: > Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.705645 5127 generic.go:334] "Generic (PLEG): container finished" podID="de6aab82-98b8-4090-bb46-192a713ee9a8" containerID="4674861e140aa628f122b3cb74c8778f7bee29b537e33ed1adbe029b312855fa" exitCode=137 Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.708441 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6dc16ed6-1e14-435e-bafa-e151fce8a2bd","Type":"ContainerStarted","Data":"eae2b3e5c1d03b447ab4b071556cd65e598fcf5121ccb93d2f98a11d444aefe1"} Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.708473 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"6dc16ed6-1e14-435e-bafa-e151fce8a2bd","Type":"ContainerStarted","Data":"80c197e4d8283eafea096468b51cd206af90310dc074208cd666d22969dfc15a"} Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.721932 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25","Type":"ContainerStarted","Data":"a78af3352d0e36402f17ac96a4a5f0faa5060395db7bd752720e0d4af495fcb6"} Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.734927 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.734895741 podStartE2EDuration="2.734895741s" podCreationTimestamp="2026-02-01 08:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:52:23.725233881 +0000 UTC m=+7494.211136245" watchObservedRunningTime="2026-02-01 08:52:23.734895741 +0000 UTC m=+7494.220798104" Feb 01 08:52:23 crc kubenswrapper[5127]: I0201 08:52:23.917497 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.248897 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.325511 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.330273 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws27h\" (UniqueName: \"kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h\") pod \"de6aab82-98b8-4090-bb46-192a713ee9a8\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.330344 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config\") pod \"de6aab82-98b8-4090-bb46-192a713ee9a8\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.330453 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret\") pod \"de6aab82-98b8-4090-bb46-192a713ee9a8\" (UID: \"de6aab82-98b8-4090-bb46-192a713ee9a8\") " Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.339324 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h" (OuterVolumeSpecName: "kube-api-access-ws27h") pod "de6aab82-98b8-4090-bb46-192a713ee9a8" (UID: "de6aab82-98b8-4090-bb46-192a713ee9a8"). InnerVolumeSpecName "kube-api-access-ws27h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.367935 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "de6aab82-98b8-4090-bb46-192a713ee9a8" (UID: "de6aab82-98b8-4090-bb46-192a713ee9a8"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.420720 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "de6aab82-98b8-4090-bb46-192a713ee9a8" (UID: "de6aab82-98b8-4090-bb46-192a713ee9a8"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.432350 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws27h\" (UniqueName: \"kubernetes.io/projected/de6aab82-98b8-4090-bb46-192a713ee9a8-kube-api-access-ws27h\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.432387 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.432397 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/de6aab82-98b8-4090-bb46-192a713ee9a8-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.730820 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4","Type":"ContainerStarted","Data":"f27c39b1d9369d90978f05bef97c828350ce7540adb30e412292a1d50106f1c2"} Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.732508 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25","Type":"ContainerStarted","Data":"13fd0ab58bde3a9e5c54be6ec7abe2ed8d285cddff4bfe69463180041ba8f6ca"} Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.733743 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.734856 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerStarted","Data":"ddf06c97931deda4be0decc0708ab58d85a3623faf8f6161c3c7904d26016ca6"} Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.737064 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.740725 5127 scope.go:117] "RemoveContainer" containerID="4674861e140aa628f122b3cb74c8778f7bee29b537e33ed1adbe029b312855fa" Feb 01 08:52:24 crc kubenswrapper[5127]: I0201 08:52:24.778599 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.080291221 podStartE2EDuration="3.778567741s" podCreationTimestamp="2026-02-01 08:52:21 +0000 UTC" firstStartedPulling="2026-02-01 08:52:23.29714073 +0000 UTC m=+7493.783043093" lastFinishedPulling="2026-02-01 08:52:23.99541725 +0000 UTC m=+7494.481319613" observedRunningTime="2026-02-01 08:52:24.775816487 +0000 UTC m=+7495.261718850" watchObservedRunningTime="2026-02-01 08:52:24.778567741 +0000 UTC m=+7495.264470094" Feb 01 08:52:26 crc kubenswrapper[5127]: I0201 08:52:26.248059 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de6aab82-98b8-4090-bb46-192a713ee9a8" path="/var/lib/kubelet/pods/de6aab82-98b8-4090-bb46-192a713ee9a8/volumes" Feb 01 08:52:29 crc kubenswrapper[5127]: I0201 08:52:29.798269 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerStarted","Data":"a38fd48f94e369aa8891a709f200d24c118bcfe335d432834f4d78a024a7294d"} Feb 01 08:52:30 crc kubenswrapper[5127]: I0201 08:52:30.811174 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4","Type":"ContainerStarted","Data":"4c51004587e35c703fcb46b2a50d9430ddf5ce6fc03687320a1231dd2d7e21d1"} Feb 01 08:52:31 crc kubenswrapper[5127]: I0201 08:52:31.236062 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:52:31 crc kubenswrapper[5127]: E0201 08:52:31.236488 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:52:32 crc kubenswrapper[5127]: I0201 08:52:32.188690 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 01 08:52:32 crc kubenswrapper[5127]: I0201 08:52:32.634666 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:52:32 crc kubenswrapper[5127]: I0201 08:52:32.687006 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:52:32 crc kubenswrapper[5127]: I0201 08:52:32.884649 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:52:33 crc kubenswrapper[5127]: I0201 08:52:33.855552 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bdsbn" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" containerID="cri-o://2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b" gracePeriod=2 Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.532946 
5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.679217 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdxzn\" (UniqueName: \"kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn\") pod \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.679417 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content\") pod \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.679435 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities\") pod \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\" (UID: \"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0\") " Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.681009 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities" (OuterVolumeSpecName: "utilities") pod "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" (UID: "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.684746 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn" (OuterVolumeSpecName: "kube-api-access-hdxzn") pod "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" (UID: "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0"). InnerVolumeSpecName "kube-api-access-hdxzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.785988 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.786034 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdxzn\" (UniqueName: \"kubernetes.io/projected/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-kube-api-access-hdxzn\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.800846 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" (UID: "76dc6ff2-adb5-4c24-8f5e-433040a4d4b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.865950 5127 generic.go:334] "Generic (PLEG): container finished" podID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerID="2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b" exitCode=0 Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.865991 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerDied","Data":"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b"} Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.866026 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdsbn" event={"ID":"76dc6ff2-adb5-4c24-8f5e-433040a4d4b0","Type":"ContainerDied","Data":"614fdaec4f9d0c93733849017decfc2a6b7882b8c5ec9d7117dcde03ff559125"} Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.866027 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdsbn" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.866043 5127 scope.go:117] "RemoveContainer" containerID="2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.888501 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.900309 5127 scope.go:117] "RemoveContainer" containerID="ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.911316 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.924997 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bdsbn"] Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.939228 5127 scope.go:117] "RemoveContainer" containerID="0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.977418 5127 scope.go:117] "RemoveContainer" containerID="2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b" Feb 01 08:52:34 crc kubenswrapper[5127]: E0201 08:52:34.978875 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b\": container with ID starting with 2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b not found: ID does not exist" containerID="2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.978905 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b"} err="failed to get container status \"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b\": rpc error: code = NotFound desc = could not find container \"2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b\": container with ID starting with 2beea6a799dd1b40f7bb228a8f53b4fd53ea729303831b6178ac217a45deb02b not found: ID does not exist" Feb 01 08:52:34 crc 
kubenswrapper[5127]: I0201 08:52:34.978927 5127 scope.go:117] "RemoveContainer" containerID="ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33" Feb 01 08:52:34 crc kubenswrapper[5127]: E0201 08:52:34.979133 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33\": container with ID starting with ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33 not found: ID does not exist" containerID="ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.979149 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33"} err="failed to get container status \"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33\": rpc error: code = NotFound desc = could not find container \"ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33\": container with ID starting with ee29537e16dbd8997f24d9493887dd24e95f18c3f2bd53e9d0a7190ef1d41a33 not found: ID does not exist" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.979162 5127 scope.go:117] "RemoveContainer" containerID="0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284" Feb 01 08:52:34 crc kubenswrapper[5127]: E0201 08:52:34.979334 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284\": container with ID starting with 0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284 not found: ID does not exist" containerID="0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284" Feb 01 08:52:34 crc kubenswrapper[5127]: I0201 08:52:34.979351 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284"} err="failed to get container status \"0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284\": rpc error: code = NotFound desc = could not find container \"0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284\": container with ID starting with 0b8dbf2757fa7efd6cc76e5422d5d2acbccdb0b7c1a8d67312be807112aa3284 not found: ID does not exist" Feb 01 08:52:36 crc kubenswrapper[5127]: I0201 08:52:36.254663 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" path="/var/lib/kubelet/pods/76dc6ff2-adb5-4c24-8f5e-433040a4d4b0/volumes" Feb 01 08:52:37 crc kubenswrapper[5127]: I0201 08:52:37.905143 5127 generic.go:334] "Generic (PLEG): container finished" podID="d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4" containerID="4c51004587e35c703fcb46b2a50d9430ddf5ce6fc03687320a1231dd2d7e21d1" exitCode=0 Feb 01 08:52:37 crc kubenswrapper[5127]: I0201 08:52:37.905301 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4","Type":"ContainerDied","Data":"4c51004587e35c703fcb46b2a50d9430ddf5ce6fc03687320a1231dd2d7e21d1"} Feb 01 08:52:37 crc kubenswrapper[5127]: I0201 08:52:37.912936 5127 generic.go:334] "Generic (PLEG): container finished" podID="ee773d63-9213-4566-a575-e86b796fa167" containerID="a38fd48f94e369aa8891a709f200d24c118bcfe335d432834f4d78a024a7294d" exitCode=0 Feb 01 08:52:37 crc 
kubenswrapper[5127]: I0201 08:52:37.912998 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerDied","Data":"a38fd48f94e369aa8891a709f200d24c118bcfe335d432834f4d78a024a7294d"} Feb 01 08:52:40 crc kubenswrapper[5127]: I0201 08:52:40.958530 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4","Type":"ContainerStarted","Data":"e9558658457a67f1e25f3e56c7480f5edcaa9550f53b760535b1b036d2e5bf53"} Feb 01 08:52:44 crc kubenswrapper[5127]: I0201 08:52:44.469386 5127 scope.go:117] "RemoveContainer" containerID="d6ba0be7c47ffbdb522bfc579d6a9b4d1a5c7e96c967b09176983bd26cd56f4a" Feb 01 08:52:45 crc kubenswrapper[5127]: I0201 08:52:45.236781 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:52:45 crc kubenswrapper[5127]: E0201 08:52:45.237355 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:52:45 crc kubenswrapper[5127]: I0201 08:52:45.912325 5127 scope.go:117] "RemoveContainer" containerID="362d988508c520e49c106c4f15ee9025a16776bd1b9ba5bc6ea277195249315e" Feb 01 08:52:46 crc kubenswrapper[5127]: I0201 08:52:46.018140 5127 scope.go:117] "RemoveContainer" containerID="4f9dfa6b6acfa850ec7a5506c38ba79e71f448b610e2911ca672c7314de9c33b" Feb 01 08:52:46 crc kubenswrapper[5127]: I0201 08:52:46.035272 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4","Type":"ContainerStarted","Data":"226862adcea01f7019d0753df0c7db52a07898671099251549c50872167fd816"} Feb 01 08:52:46 crc kubenswrapper[5127]: I0201 08:52:46.096313 5127 scope.go:117] "RemoveContainer" containerID="82aec21c60360ef43c9ee5de5258cab62b87f4e99a37a14fb0cbc052b4bc4ef2" Feb 01 08:52:47 crc kubenswrapper[5127]: I0201 08:52:47.054164 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerStarted","Data":"8df7670cecaf283e13bffa3daf967351be163a5885482ff939694a0ac93803d1"} Feb 01 08:52:47 crc kubenswrapper[5127]: I0201 08:52:47.054999 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:47 crc kubenswrapper[5127]: I0201 08:52:47.058697 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 01 08:52:47 crc kubenswrapper[5127]: I0201 08:52:47.103271 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.800720892 podStartE2EDuration="25.103245609s" podCreationTimestamp="2026-02-01 08:52:22 +0000 UTC" firstStartedPulling="2026-02-01 08:52:23.987106936 +0000 UTC m=+7494.473009289" lastFinishedPulling="2026-02-01 08:52:40.289631643 +0000 UTC m=+7510.775534006" observedRunningTime="2026-02-01 08:52:47.098294406 +0000 UTC m=+7517.584196809" 
watchObservedRunningTime="2026-02-01 08:52:47.103245609 +0000 UTC m=+7517.589148002" Feb 01 08:52:51 crc kubenswrapper[5127]: I0201 08:52:51.103388 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerStarted","Data":"e9517234e498475f450c4c4ef691e021b7ae16c9f4d3e7275848981220d4b6c7"} Feb 01 08:52:55 crc kubenswrapper[5127]: I0201 08:52:55.167691 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"ee773d63-9213-4566-a575-e86b796fa167","Type":"ContainerStarted","Data":"104f26eecd30660301daac49579e7be0a7a96bf002ce570acfd35fb8f847ed7f"} Feb 01 08:52:55 crc kubenswrapper[5127]: I0201 08:52:55.198825 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.284948415 podStartE2EDuration="33.198804486s" podCreationTimestamp="2026-02-01 08:52:22 +0000 UTC" firstStartedPulling="2026-02-01 08:52:24.371067773 +0000 UTC m=+7494.856970136" lastFinishedPulling="2026-02-01 08:52:54.284923824 +0000 UTC m=+7524.770826207" observedRunningTime="2026-02-01 08:52:55.197999664 +0000 UTC m=+7525.683902027" watchObservedRunningTime="2026-02-01 08:52:55.198804486 +0000 UTC m=+7525.684706849" Feb 01 08:52:58 crc kubenswrapper[5127]: I0201 08:52:58.236380 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:52:58 crc kubenswrapper[5127]: E0201 08:52:58.237749 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:52:58 crc kubenswrapper[5127]: I0201 08:52:58.654930 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.075547 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-21bb-account-create-update-ks2wx"] Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.087203 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9mdcv"] Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.095205 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-21bb-account-create-update-ks2wx"] Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.103458 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9mdcv"] Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.248671 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6ff443-99a7-45a7-8fae-ed6495357ab0" path="/var/lib/kubelet/pods/0d6ff443-99a7-45a7-8fae-ed6495357ab0/volumes" Feb 01 08:53:00 crc kubenswrapper[5127]: I0201 08:53:00.250473 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13dd2e31-d7bd-4984-bc91-480330e01ed1" path="/var/lib/kubelet/pods/13dd2e31-d7bd-4984-bc91-480330e01ed1/volumes" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.481719 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:01 crc kubenswrapper[5127]: E0201 
08:53:01.482169 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="extract-content" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.482187 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="extract-content" Feb 01 08:53:01 crc kubenswrapper[5127]: E0201 08:53:01.482220 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.482231 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" Feb 01 08:53:01 crc kubenswrapper[5127]: E0201 08:53:01.482269 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="extract-utilities" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.482282 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="extract-utilities" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.482533 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dc6ff2-adb5-4c24-8f5e-433040a4d4b0" containerName="registry-server" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.485140 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.487876 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.489217 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.531956 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643138 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643202 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643223 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643240 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643267 5127 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643366 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.643386 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42v72\" (UniqueName: \"kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745321 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745368 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745396 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745426 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745497 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745519 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42v72\" (UniqueName: \"kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.745617 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 
08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.746070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.747421 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.753073 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.756059 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.758957 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.763648 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42v72\" (UniqueName: \"kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.770872 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " pod="openstack/ceilometer-0" Feb 01 08:53:01 crc kubenswrapper[5127]: I0201 08:53:01.831811 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:02 crc kubenswrapper[5127]: I0201 08:53:02.340039 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:02 crc kubenswrapper[5127]: W0201 08:53:02.344205 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce6b9dce_87e1_4788_ad0f_62202c1d2ef6.slice/crio-031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86 WatchSource:0}: Error finding container 031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86: Status 404 returned error can't find the container with id 031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86 Feb 01 08:53:03 crc kubenswrapper[5127]: I0201 08:53:03.279228 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerStarted","Data":"031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86"} Feb 01 08:53:07 crc kubenswrapper[5127]: I0201 08:53:07.321306 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerStarted","Data":"0a7badc1a06a46ac9174424344c02d89f78cbdd3ce09bc80919713ee4ac9e823"} Feb 01 08:53:08 crc kubenswrapper[5127]: I0201 08:53:08.335922 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerStarted","Data":"68e0e6dff98419c8db624beacab38238758bb0acfb839cbd1d2623f45684b74a"} Feb 01 08:53:08 crc kubenswrapper[5127]: I0201 08:53:08.654309 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 01 08:53:08 crc kubenswrapper[5127]: I0201 08:53:08.657850 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 01 08:53:09 crc kubenswrapper[5127]: I0201 08:53:09.346032 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerStarted","Data":"bee0c921be2ee8396b0f6b5247af52d5007c1b2cacb9f22613676c520d165b0b"} Feb 01 08:53:09 crc kubenswrapper[5127]: I0201 08:53:09.347650 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 01 08:53:11 crc kubenswrapper[5127]: I0201 08:53:11.364095 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerStarted","Data":"cab38501d75c76307aa9e3889abf1afc81c1433236c8c7e9b4f9ce70606a253f"} Feb 01 08:53:11 crc kubenswrapper[5127]: I0201 08:53:11.364630 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 08:53:11 crc kubenswrapper[5127]: I0201 08:53:11.394827 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.200650755 podStartE2EDuration="10.39480697s" podCreationTimestamp="2026-02-01 08:53:01 +0000 UTC" firstStartedPulling="2026-02-01 08:53:02.346277481 +0000 UTC m=+7532.832179844" lastFinishedPulling="2026-02-01 08:53:10.540433686 +0000 UTC m=+7541.026336059" observedRunningTime="2026-02-01 08:53:11.38772398 +0000 UTC m=+7541.873626343" watchObservedRunningTime="2026-02-01 08:53:11.39480697 +0000 UTC m=+7541.880709343" 
Feb 01 08:53:12 crc kubenswrapper[5127]: I0201 08:53:12.235921 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:53:12 crc kubenswrapper[5127]: E0201 08:53:12.236249 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.381426 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-lcpbf"] Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.383379 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.395259 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck7j\" (UniqueName: \"kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.395710 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.395288 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lcpbf"] Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.480928 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-84b9-account-create-update-zttlk"] Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.482274 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.486173 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.493364 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-84b9-account-create-update-zttlk"] Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.497534 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck7j\" (UniqueName: \"kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.497574 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.497612 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.497979 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9lwn\" (UniqueName: \"kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.498802 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.516707 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck7j\" (UniqueName: \"kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j\") pod \"aodh-db-create-lcpbf\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.599764 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.599910 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9lwn\" (UniqueName: \"kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: 
\"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.600896 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.617000 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9lwn\" (UniqueName: \"kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn\") pod \"aodh-84b9-account-create-update-zttlk\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.709707 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:18 crc kubenswrapper[5127]: I0201 08:53:18.800945 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:19 crc kubenswrapper[5127]: I0201 08:53:19.341178 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-lcpbf"] Feb 01 08:53:19 crc kubenswrapper[5127]: W0201 08:53:19.354859 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a45274_c5ce_49fa_a520_d69f46c46b93.slice/crio-66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c WatchSource:0}: Error finding container 66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c: Status 404 returned error can't find the container with id 66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c Feb 01 08:53:19 crc kubenswrapper[5127]: I0201 08:53:19.430180 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-84b9-account-create-update-zttlk"] Feb 01 08:53:19 crc kubenswrapper[5127]: W0201 08:53:19.434851 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode93aa97d_5839_45fb_91a0_dd94e95495fa.slice/crio-33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70 WatchSource:0}: Error finding container 33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70: Status 404 returned error can't find the container with id 33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70 Feb 01 08:53:19 crc kubenswrapper[5127]: I0201 08:53:19.479146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lcpbf" event={"ID":"b0a45274-c5ce-49fa-a520-d69f46c46b93","Type":"ContainerStarted","Data":"66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c"} Feb 01 08:53:19 crc kubenswrapper[5127]: I0201 08:53:19.481979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-84b9-account-create-update-zttlk" event={"ID":"e93aa97d-5839-45fb-91a0-dd94e95495fa","Type":"ContainerStarted","Data":"33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70"} Feb 01 08:53:20 crc kubenswrapper[5127]: E0201 08:53:20.092333 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a45274_c5ce_49fa_a520_d69f46c46b93.slice/crio-conmon-605a99769321cade893991a0145c19b2ab7d6c8570067852615a0ad38846adc6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0a45274_c5ce_49fa_a520_d69f46c46b93.slice/crio-605a99769321cade893991a0145c19b2ab7d6c8570067852615a0ad38846adc6.scope\": RecentStats: unable to find data in memory cache]" Feb 01 08:53:20 crc kubenswrapper[5127]: I0201 08:53:20.491771 5127 generic.go:334] "Generic (PLEG): container finished" podID="e93aa97d-5839-45fb-91a0-dd94e95495fa" containerID="b5479f2e38ac4243bc5a0ea8dba1c792c416ea275a16c05b7a6eddcf799e8836" exitCode=0 Feb 01 08:53:20 crc kubenswrapper[5127]: I0201 08:53:20.491832 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-84b9-account-create-update-zttlk" event={"ID":"e93aa97d-5839-45fb-91a0-dd94e95495fa","Type":"ContainerDied","Data":"b5479f2e38ac4243bc5a0ea8dba1c792c416ea275a16c05b7a6eddcf799e8836"} Feb 01 08:53:20 crc kubenswrapper[5127]: I0201 08:53:20.493679 5127 generic.go:334] "Generic (PLEG): container finished" podID="b0a45274-c5ce-49fa-a520-d69f46c46b93" containerID="605a99769321cade893991a0145c19b2ab7d6c8570067852615a0ad38846adc6" exitCode=0 Feb 01 08:53:20 crc kubenswrapper[5127]: I0201 08:53:20.493719 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lcpbf" event={"ID":"b0a45274-c5ce-49fa-a520-d69f46c46b93","Type":"ContainerDied","Data":"605a99769321cade893991a0145c19b2ab7d6c8570067852615a0ad38846adc6"} Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.133447 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.140742 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.277702 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ck7j\" (UniqueName: \"kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j\") pod \"b0a45274-c5ce-49fa-a520-d69f46c46b93\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.277879 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts\") pod \"b0a45274-c5ce-49fa-a520-d69f46c46b93\" (UID: \"b0a45274-c5ce-49fa-a520-d69f46c46b93\") " Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.277947 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9lwn\" (UniqueName: \"kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn\") pod \"e93aa97d-5839-45fb-91a0-dd94e95495fa\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.278002 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts\") pod \"e93aa97d-5839-45fb-91a0-dd94e95495fa\" (UID: \"e93aa97d-5839-45fb-91a0-dd94e95495fa\") " Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.278516 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b0a45274-c5ce-49fa-a520-d69f46c46b93" (UID: "b0a45274-c5ce-49fa-a520-d69f46c46b93"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.278682 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0a45274-c5ce-49fa-a520-d69f46c46b93-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.279185 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e93aa97d-5839-45fb-91a0-dd94e95495fa" (UID: "e93aa97d-5839-45fb-91a0-dd94e95495fa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.283937 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j" (OuterVolumeSpecName: "kube-api-access-4ck7j") pod "b0a45274-c5ce-49fa-a520-d69f46c46b93" (UID: "b0a45274-c5ce-49fa-a520-d69f46c46b93"). InnerVolumeSpecName "kube-api-access-4ck7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.284706 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn" (OuterVolumeSpecName: "kube-api-access-z9lwn") pod "e93aa97d-5839-45fb-91a0-dd94e95495fa" (UID: "e93aa97d-5839-45fb-91a0-dd94e95495fa"). 
InnerVolumeSpecName "kube-api-access-z9lwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.381081 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e93aa97d-5839-45fb-91a0-dd94e95495fa-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.381505 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ck7j\" (UniqueName: \"kubernetes.io/projected/b0a45274-c5ce-49fa-a520-d69f46c46b93-kube-api-access-4ck7j\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.381517 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9lwn\" (UniqueName: \"kubernetes.io/projected/e93aa97d-5839-45fb-91a0-dd94e95495fa-kube-api-access-z9lwn\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.519490 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-lcpbf" event={"ID":"b0a45274-c5ce-49fa-a520-d69f46c46b93","Type":"ContainerDied","Data":"66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c"} Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.519612 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66caaad62ea00e2656b09b8365fa7d27a66683918f736a18a029319b9adfe87c" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.519511 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-lcpbf" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.522653 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-84b9-account-create-update-zttlk" event={"ID":"e93aa97d-5839-45fb-91a0-dd94e95495fa","Type":"ContainerDied","Data":"33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70"} Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.522695 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-84b9-account-create-update-zttlk" Feb 01 08:53:22 crc kubenswrapper[5127]: I0201 08:53:22.522698 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33caa2467b97bd04c497c547559f80b76df18b544fe8e6df2378b3f79850dd70" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.761696 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-wlfqn"] Feb 01 08:53:23 crc kubenswrapper[5127]: E0201 08:53:23.762768 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a45274-c5ce-49fa-a520-d69f46c46b93" containerName="mariadb-database-create" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.762785 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a45274-c5ce-49fa-a520-d69f46c46b93" containerName="mariadb-database-create" Feb 01 08:53:23 crc kubenswrapper[5127]: E0201 08:53:23.762830 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93aa97d-5839-45fb-91a0-dd94e95495fa" containerName="mariadb-account-create-update" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.762838 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93aa97d-5839-45fb-91a0-dd94e95495fa" containerName="mariadb-account-create-update" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.763093 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93aa97d-5839-45fb-91a0-dd94e95495fa" containerName="mariadb-account-create-update" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.763114 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a45274-c5ce-49fa-a520-d69f46c46b93" containerName="mariadb-database-create" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.764159 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.767530 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-lwtrk" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.773793 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.773966 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-wlfqn"] Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.774548 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.775182 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.916977 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgb8f\" (UniqueName: \"kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.917216 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.917358 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:23 crc kubenswrapper[5127]: I0201 08:53:23.917574 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.020063 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgb8f\" (UniqueName: \"kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.020222 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.020328 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc 
kubenswrapper[5127]: I0201 08:53:24.020450 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.027452 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.029204 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.029783 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.047157 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgb8f\" (UniqueName: \"kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f\") pod \"aodh-db-sync-wlfqn\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.071336 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zfmhw"] Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.083105 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.088411 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zfmhw"] Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.236061 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:53:24 crc kubenswrapper[5127]: E0201 08:53:24.236764 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:53:24 crc kubenswrapper[5127]: I0201 08:53:24.256132 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1054ddd-1c98-4a90-a760-31b8044a68a3" path="/var/lib/kubelet/pods/e1054ddd-1c98-4a90-a760-31b8044a68a3/volumes" Feb 01 08:53:25 crc kubenswrapper[5127]: I0201 08:53:25.386974 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-wlfqn"] Feb 01 08:53:25 crc kubenswrapper[5127]: I0201 08:53:25.553133 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wlfqn" event={"ID":"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8","Type":"ContainerStarted","Data":"25a748218c1ac5a1846b62dcb9dbca2d04820e6a6cb67f876be2fca7de155d9d"} Feb 01 08:53:30 crc kubenswrapper[5127]: I0201 08:53:30.779553 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 01 08:53:31 crc kubenswrapper[5127]: I0201 08:53:31.635427 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wlfqn" event={"ID":"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8","Type":"ContainerStarted","Data":"3eb9d93708f4f0e8cc469dd1e54b48d11fb4eba439d7cf79f7aeb2d60c3064f5"} Feb 01 08:53:31 crc kubenswrapper[5127]: I0201 08:53:31.667323 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-wlfqn" podStartSLOduration=3.291508057 podStartE2EDuration="8.667282523s" podCreationTimestamp="2026-02-01 08:53:23 +0000 UTC" firstStartedPulling="2026-02-01 08:53:25.400091499 +0000 UTC m=+7555.885993862" lastFinishedPulling="2026-02-01 08:53:30.775865955 +0000 UTC m=+7561.261768328" observedRunningTime="2026-02-01 08:53:31.662826713 +0000 UTC m=+7562.148729076" watchObservedRunningTime="2026-02-01 08:53:31.667282523 +0000 UTC m=+7562.153184886" Feb 01 08:53:31 crc kubenswrapper[5127]: I0201 08:53:31.839617 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 08:53:33 crc kubenswrapper[5127]: I0201 08:53:33.658018 5127 generic.go:334] "Generic (PLEG): container finished" podID="cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" containerID="3eb9d93708f4f0e8cc469dd1e54b48d11fb4eba439d7cf79f7aeb2d60c3064f5" exitCode=0 Feb 01 08:53:33 crc kubenswrapper[5127]: I0201 08:53:33.658070 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wlfqn" event={"ID":"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8","Type":"ContainerDied","Data":"3eb9d93708f4f0e8cc469dd1e54b48d11fb4eba439d7cf79f7aeb2d60c3064f5"} Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.157047 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.314777 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgb8f\" (UniqueName: \"kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f\") pod \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.314860 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle\") pod \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.314926 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts\") pod \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.315191 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data\") pod \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\" (UID: \"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8\") " Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.322842 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f" (OuterVolumeSpecName: "kube-api-access-pgb8f") pod "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" (UID: "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8"). InnerVolumeSpecName "kube-api-access-pgb8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.323190 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts" (OuterVolumeSpecName: "scripts") pod "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" (UID: "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.354904 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data" (OuterVolumeSpecName: "config-data") pod "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" (UID: "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.358211 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" (UID: "cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.418115 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.418200 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgb8f\" (UniqueName: \"kubernetes.io/projected/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-kube-api-access-pgb8f\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.418220 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.418235 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.694365 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-wlfqn" event={"ID":"cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8","Type":"ContainerDied","Data":"25a748218c1ac5a1846b62dcb9dbca2d04820e6a6cb67f876be2fca7de155d9d"} Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.694430 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a748218c1ac5a1846b62dcb9dbca2d04820e6a6cb67f876be2fca7de155d9d" Feb 01 08:53:35 crc kubenswrapper[5127]: I0201 08:53:35.694440 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-wlfqn" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.500944 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 01 08:53:38 crc kubenswrapper[5127]: E0201 08:53:38.501992 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" containerName="aodh-db-sync" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.502008 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" containerName="aodh-db-sync" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.502244 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" containerName="aodh-db-sync" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.506038 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.509595 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-lwtrk" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.510290 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.510328 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.534005 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.693639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.693707 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-config-data\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.693733 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-scripts\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.693876 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjmm\" (UniqueName: \"kubernetes.io/projected/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-kube-api-access-srjmm\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.796030 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjmm\" (UniqueName: \"kubernetes.io/projected/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-kube-api-access-srjmm\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.796101 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.796127 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-config-data\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.796144 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-scripts\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: 
I0201 08:53:38.803407 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-scripts\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.804385 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-config-data\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.806635 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.816192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjmm\" (UniqueName: \"kubernetes.io/projected/0ef0879a-fd11-40c8-aabe-6e26c48c5b5f-kube-api-access-srjmm\") pod \"aodh-0\" (UID: \"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f\") " pod="openstack/aodh-0" Feb 01 08:53:38 crc kubenswrapper[5127]: I0201 08:53:38.831282 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.236048 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:53:39 crc kubenswrapper[5127]: E0201 08:53:39.236533 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.317060 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.735893 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f","Type":"ContainerStarted","Data":"57ed1310fd69ae0fa622da1235efd5b4560b8c3d1b4b167af15014c1fca099ae"} Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.994241 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.995590 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-notification-agent" containerID="cri-o://68e0e6dff98419c8db624beacab38238758bb0acfb839cbd1d2623f45684b74a" gracePeriod=30 Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.995555 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="sg-core" containerID="cri-o://bee0c921be2ee8396b0f6b5247af52d5007c1b2cacb9f22613676c520d165b0b" gracePeriod=30 Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.995626 5127 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="proxy-httpd" containerID="cri-o://cab38501d75c76307aa9e3889abf1afc81c1433236c8c7e9b4f9ce70606a253f" gracePeriod=30 Feb 01 08:53:39 crc kubenswrapper[5127]: I0201 08:53:39.995436 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-central-agent" containerID="cri-o://0a7badc1a06a46ac9174424344c02d89f78cbdd3ce09bc80919713ee4ac9e823" gracePeriod=30 Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.749802 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f","Type":"ContainerStarted","Data":"5074b8167981a15938d6547cf33593d76323e3ee5947fbb7574d0a4ce8d4f48e"} Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755180 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerID="cab38501d75c76307aa9e3889abf1afc81c1433236c8c7e9b4f9ce70606a253f" exitCode=0 Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755199 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerID="bee0c921be2ee8396b0f6b5247af52d5007c1b2cacb9f22613676c520d165b0b" exitCode=2 Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755210 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerID="0a7badc1a06a46ac9174424344c02d89f78cbdd3ce09bc80919713ee4ac9e823" exitCode=0 Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755225 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerDied","Data":"cab38501d75c76307aa9e3889abf1afc81c1433236c8c7e9b4f9ce70606a253f"} Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755241 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerDied","Data":"bee0c921be2ee8396b0f6b5247af52d5007c1b2cacb9f22613676c520d165b0b"} Feb 01 08:53:40 crc kubenswrapper[5127]: I0201 08:53:40.755251 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerDied","Data":"0a7badc1a06a46ac9174424344c02d89f78cbdd3ce09bc80919713ee4ac9e823"} Feb 01 08:53:41 crc kubenswrapper[5127]: I0201 08:53:41.766481 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f","Type":"ContainerStarted","Data":"24f7195993b49627a4daf7e65d70b14bdb5364824bff2d6a8a902c4cd8b9f8fc"} Feb 01 08:53:42 crc kubenswrapper[5127]: I0201 08:53:42.796259 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f","Type":"ContainerStarted","Data":"a3f9a6b14b3808e16e4cb5e057b58c75e025dddbc925ea9cb756143f2124dc82"} Feb 01 08:53:44 crc kubenswrapper[5127]: I0201 08:53:44.824536 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0ef0879a-fd11-40c8-aabe-6e26c48c5b5f","Type":"ContainerStarted","Data":"f5498049c04756a9895918e493d1265c4c56a58186f350d261762635ce673aa0"} Feb 01 08:53:44 crc kubenswrapper[5127]: I0201 08:53:44.867451 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" 
podStartSLOduration=1.773830126 podStartE2EDuration="6.867430272s" podCreationTimestamp="2026-02-01 08:53:38 +0000 UTC" firstStartedPulling="2026-02-01 08:53:39.32062973 +0000 UTC m=+7569.806532093" lastFinishedPulling="2026-02-01 08:53:44.414229876 +0000 UTC m=+7574.900132239" observedRunningTime="2026-02-01 08:53:44.841717801 +0000 UTC m=+7575.327620184" watchObservedRunningTime="2026-02-01 08:53:44.867430272 +0000 UTC m=+7575.353332635" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.842731 5127 generic.go:334] "Generic (PLEG): container finished" podID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerID="68e0e6dff98419c8db624beacab38238758bb0acfb839cbd1d2623f45684b74a" exitCode=0 Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.842814 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerDied","Data":"68e0e6dff98419c8db624beacab38238758bb0acfb839cbd1d2623f45684b74a"} Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.844096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6","Type":"ContainerDied","Data":"031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86"} Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.844113 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031ba0eb58b89f4e41c40d1ba38f199c86fef097b5f27cba3c7192e5d28b8f86" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.877148 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988092 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988332 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988402 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988436 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988556 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42v72\" (UniqueName: \"kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988608 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.988687 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data\") pod \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\" (UID: \"ce6b9dce-87e1-4788-ad0f-62202c1d2ef6\") " Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.989834 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.990072 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.995701 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72" (OuterVolumeSpecName: "kube-api-access-42v72") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "kube-api-access-42v72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:45 crc kubenswrapper[5127]: I0201 08:53:45.996554 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts" (OuterVolumeSpecName: "scripts") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.022975 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.093287 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.093343 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.093352 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.093363 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.093372 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42v72\" (UniqueName: \"kubernetes.io/projected/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-kube-api-access-42v72\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.101602 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.143323 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data" (OuterVolumeSpecName: "config-data") pod "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" (UID: "ce6b9dce-87e1-4788-ad0f-62202c1d2ef6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.195035 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.195073 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.391382 5127 scope.go:117] "RemoveContainer" containerID="375cff144ce619c5ff16b256717ea8a57770301acf6afd3c5ae93616e45b2645" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.422628 5127 scope.go:117] "RemoveContainer" containerID="59b3fa772fcda5210fc68d8201644615fdf0ff3ec0ede819d2e89e94cc4ce57d" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.466792 5127 scope.go:117] "RemoveContainer" containerID="5f367f456f379e5ac4964bcdb7cd6a1a64f4e09e5b731f3b2200c87b925894b5" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.850911 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.881151 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.897488 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.909423 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:46 crc kubenswrapper[5127]: E0201 08:53:46.909912 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="proxy-httpd" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.909934 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="proxy-httpd" Feb 01 08:53:46 crc kubenswrapper[5127]: E0201 08:53:46.909974 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-central-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.909983 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-central-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: E0201 08:53:46.910022 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="sg-core" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910031 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="sg-core" Feb 01 08:53:46 crc kubenswrapper[5127]: E0201 08:53:46.910043 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-notification-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910051 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-notification-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910260 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-central-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910298 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="proxy-httpd" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910324 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="ceilometer-notification-agent" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.910342 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" containerName="sg-core" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.912528 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.915197 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.915642 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 08:53:46 crc kubenswrapper[5127]: I0201 08:53:46.939506 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012218 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012367 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012433 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012481 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012771 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jdx9\" (UniqueName: \"kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.012915 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.115052 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 
08:53:47.115546 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.115663 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.115752 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.115955 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.116046 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jdx9\" (UniqueName: \"kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.116126 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.116260 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.116397 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.120441 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.120902 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.121472 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.128336 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.137339 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jdx9\" (UniqueName: \"kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9\") pod \"ceilometer-0\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.233310 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.787209 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:53:47 crc kubenswrapper[5127]: I0201 08:53:47.863491 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerStarted","Data":"5b0a63b03c0c142507dd13ad1abf4630ef65ca0c86ed9e11419f5d8d567e2fcd"} Feb 01 08:53:48 crc kubenswrapper[5127]: I0201 08:53:48.246599 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce6b9dce-87e1-4788-ad0f-62202c1d2ef6" path="/var/lib/kubelet/pods/ce6b9dce-87e1-4788-ad0f-62202c1d2ef6/volumes" Feb 01 08:53:48 crc kubenswrapper[5127]: I0201 08:53:48.874838 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerStarted","Data":"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d"} Feb 01 08:53:48 crc kubenswrapper[5127]: I0201 08:53:48.874886 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerStarted","Data":"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc"} Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.401090 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-7s4ch"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.403077 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.413999 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7s4ch"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.491735 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.493777 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.549050 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.562202 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-85ba-account-create-update-ldkxf"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.563532 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.567838 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.572812 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b26bs\" (UniqueName: \"kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.579443 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.584754 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-85ba-account-create-update-ldkxf"] Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.681772 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682171 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682244 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t89f\" (UniqueName: \"kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682278 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b26bs\" (UniqueName: \"kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682339 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts\") pod \"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682390 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.682436 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-894fk\" (UniqueName: \"kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk\") pod \"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.683717 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.704439 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b26bs\" (UniqueName: \"kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs\") pod \"manila-db-create-7s4ch\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.726374 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.783888 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.784248 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t89f\" (UniqueName: \"kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.784406 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts\") pod \"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.784542 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.784626 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.784784 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-894fk\" (UniqueName: \"kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk\") pod \"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.785718 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts\") pod \"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.785911 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.802978 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-894fk\" (UniqueName: \"kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk\") pod 
\"manila-85ba-account-create-update-ldkxf\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.807971 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t89f\" (UniqueName: \"kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f\") pod \"certified-operators-vpjlr\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.824151 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.891076 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:49 crc kubenswrapper[5127]: I0201 08:53:49.919747 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerStarted","Data":"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.323717 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-7s4ch"] Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.544823 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:53:50 crc kubenswrapper[5127]: W0201 08:53:50.552492 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d8e7f56_b3ce_4221_b098_ea7756476a7a.slice/crio-76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f WatchSource:0}: Error finding container 76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f: Status 404 returned error can't find the container with id 76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.571763 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-85ba-account-create-update-ldkxf"] Feb 01 08:53:50 crc kubenswrapper[5127]: E0201 08:53:50.917920 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d40d048_a7c0_4857_a12e_359ef9814e7c.slice/crio-309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929.scope\": RecentStats: unable to find data in memory cache]" Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.930645 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerID="309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929" exitCode=0 Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.930744 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerDied","Data":"309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.930773 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" 
event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerStarted","Data":"0168bfffc6b3f9eb1c72ac3e5ef1557a179cc48b67ba4233b6867a29dac17a01"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.932526 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-85ba-account-create-update-ldkxf" event={"ID":"4d8e7f56-b3ce-4221-b098-ea7756476a7a","Type":"ContainerStarted","Data":"4dabf030ade965dc040c3eebb7a9945d7443c99bb1f39990a449baa478fe0d05"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.932561 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-85ba-account-create-update-ldkxf" event={"ID":"4d8e7f56-b3ce-4221-b098-ea7756476a7a","Type":"ContainerStarted","Data":"76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.934763 5127 generic.go:334] "Generic (PLEG): container finished" podID="41691139-f2d8-4a68-aa1e-205689c2b1c6" containerID="f67e806745fad421813caa3beb970830980e8f74d85e3ccd70e669302975d63c" exitCode=0 Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.934798 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7s4ch" event={"ID":"41691139-f2d8-4a68-aa1e-205689c2b1c6","Type":"ContainerDied","Data":"f67e806745fad421813caa3beb970830980e8f74d85e3ccd70e669302975d63c"} Feb 01 08:53:50 crc kubenswrapper[5127]: I0201 08:53:50.934821 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7s4ch" event={"ID":"41691139-f2d8-4a68-aa1e-205689c2b1c6","Type":"ContainerStarted","Data":"a2e625fdbbe8c075ace30779d35ee9cbbb6d91e37d4854b2fb9e346682f81758"} Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.012460 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-85ba-account-create-update-ldkxf" podStartSLOduration=2.012436745 podStartE2EDuration="2.012436745s" podCreationTimestamp="2026-02-01 08:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:53:51.010139103 +0000 UTC m=+7581.496041466" watchObservedRunningTime="2026-02-01 08:53:51.012436745 +0000 UTC m=+7581.498339108" Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.945651 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerStarted","Data":"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045"} Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.946096 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.948139 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerStarted","Data":"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3"} Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.950381 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d8e7f56-b3ce-4221-b098-ea7756476a7a" containerID="4dabf030ade965dc040c3eebb7a9945d7443c99bb1f39990a449baa478fe0d05" exitCode=0 Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.950622 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-85ba-account-create-update-ldkxf" 
event={"ID":"4d8e7f56-b3ce-4221-b098-ea7756476a7a","Type":"ContainerDied","Data":"4dabf030ade965dc040c3eebb7a9945d7443c99bb1f39990a449baa478fe0d05"} Feb 01 08:53:51 crc kubenswrapper[5127]: I0201 08:53:51.974604 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.226106987 podStartE2EDuration="5.974563733s" podCreationTimestamp="2026-02-01 08:53:46 +0000 UTC" firstStartedPulling="2026-02-01 08:53:47.795240571 +0000 UTC m=+7578.281142934" lastFinishedPulling="2026-02-01 08:53:51.543697277 +0000 UTC m=+7582.029599680" observedRunningTime="2026-02-01 08:53:51.967335308 +0000 UTC m=+7582.453237691" watchObservedRunningTime="2026-02-01 08:53:51.974563733 +0000 UTC m=+7582.460466096" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.389475 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.469467 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts\") pod \"41691139-f2d8-4a68-aa1e-205689c2b1c6\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.469635 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b26bs\" (UniqueName: \"kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs\") pod \"41691139-f2d8-4a68-aa1e-205689c2b1c6\" (UID: \"41691139-f2d8-4a68-aa1e-205689c2b1c6\") " Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.469979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41691139-f2d8-4a68-aa1e-205689c2b1c6" (UID: "41691139-f2d8-4a68-aa1e-205689c2b1c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.490244 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs" (OuterVolumeSpecName: "kube-api-access-b26bs") pod "41691139-f2d8-4a68-aa1e-205689c2b1c6" (UID: "41691139-f2d8-4a68-aa1e-205689c2b1c6"). InnerVolumeSpecName "kube-api-access-b26bs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.572229 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b26bs\" (UniqueName: \"kubernetes.io/projected/41691139-f2d8-4a68-aa1e-205689c2b1c6-kube-api-access-b26bs\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.572446 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41691139-f2d8-4a68-aa1e-205689c2b1c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.974249 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-7s4ch" event={"ID":"41691139-f2d8-4a68-aa1e-205689c2b1c6","Type":"ContainerDied","Data":"a2e625fdbbe8c075ace30779d35ee9cbbb6d91e37d4854b2fb9e346682f81758"} Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.974686 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2e625fdbbe8c075ace30779d35ee9cbbb6d91e37d4854b2fb9e346682f81758" Feb 01 08:53:52 crc kubenswrapper[5127]: I0201 08:53:52.974406 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-7s4ch" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.237422 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:53:53 crc kubenswrapper[5127]: E0201 08:53:53.237693 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.557387 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.620068 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-894fk\" (UniqueName: \"kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk\") pod \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.620689 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts\") pod \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\" (UID: \"4d8e7f56-b3ce-4221-b098-ea7756476a7a\") " Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.621398 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d8e7f56-b3ce-4221-b098-ea7756476a7a" (UID: "4d8e7f56-b3ce-4221-b098-ea7756476a7a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.622002 5127 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d8e7f56-b3ce-4221-b098-ea7756476a7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.625838 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk" (OuterVolumeSpecName: "kube-api-access-894fk") pod "4d8e7f56-b3ce-4221-b098-ea7756476a7a" (UID: "4d8e7f56-b3ce-4221-b098-ea7756476a7a"). InnerVolumeSpecName "kube-api-access-894fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.724164 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-894fk\" (UniqueName: \"kubernetes.io/projected/4d8e7f56-b3ce-4221-b098-ea7756476a7a-kube-api-access-894fk\") on node \"crc\" DevicePath \"\"" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.988009 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerID="8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3" exitCode=0 Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.988056 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerDied","Data":"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3"} Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.991222 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-85ba-account-create-update-ldkxf" event={"ID":"4d8e7f56-b3ce-4221-b098-ea7756476a7a","Type":"ContainerDied","Data":"76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f"} Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.991272 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76b2964bf766b357c9a723eef1dac3805b98ada59aa0cc2a3345e047d4278f1f" Feb 01 08:53:53 crc kubenswrapper[5127]: I0201 08:53:53.991241 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-85ba-account-create-update-ldkxf" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.053753 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a754-account-create-update-4tbqd"] Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.066318 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a754-account-create-update-4tbqd"] Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.080227 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-lhrxg"] Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.099466 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-lhrxg"] Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.248331 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289df401-c7a9-4098-95d7-94dd5affe406" path="/var/lib/kubelet/pods/289df401-c7a9-4098-95d7-94dd5affe406/volumes" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.248893 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aecbfca9-cc14-4c56-8205-84f9e4f96b6e" path="/var/lib/kubelet/pods/aecbfca9-cc14-4c56-8205-84f9e4f96b6e/volumes" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.942703 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-vjdm5"] Feb 01 08:53:54 crc kubenswrapper[5127]: E0201 08:53:54.943386 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41691139-f2d8-4a68-aa1e-205689c2b1c6" containerName="mariadb-database-create" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.943403 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="41691139-f2d8-4a68-aa1e-205689c2b1c6" containerName="mariadb-database-create" Feb 01 08:53:54 crc kubenswrapper[5127]: E0201 08:53:54.943440 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8e7f56-b3ce-4221-b098-ea7756476a7a" containerName="mariadb-account-create-update" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.943447 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8e7f56-b3ce-4221-b098-ea7756476a7a" containerName="mariadb-account-create-update" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.943658 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8e7f56-b3ce-4221-b098-ea7756476a7a" containerName="mariadb-account-create-update" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.944975 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="41691139-f2d8-4a68-aa1e-205689c2b1c6" containerName="mariadb-database-create" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.945794 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.949388 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rc22r" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.949431 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 01 08:53:54 crc kubenswrapper[5127]: I0201 08:53:54.959427 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-vjdm5"] Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.018615 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerStarted","Data":"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321"} Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.048979 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prsgt\" (UniqueName: \"kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.049184 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.049326 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.049420 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.050573 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vpjlr" podStartSLOduration=2.566725706 podStartE2EDuration="6.050551773s" podCreationTimestamp="2026-02-01 08:53:49 +0000 UTC" firstStartedPulling="2026-02-01 08:53:50.934180722 +0000 UTC m=+7581.420083085" lastFinishedPulling="2026-02-01 08:53:54.418006749 +0000 UTC m=+7584.903909152" observedRunningTime="2026-02-01 08:53:55.036257789 +0000 UTC m=+7585.522160152" watchObservedRunningTime="2026-02-01 08:53:55.050551773 +0000 UTC m=+7585.536454136" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.151099 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.151468 5127 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.151592 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prsgt\" (UniqueName: \"kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.151634 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.157295 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.157790 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.158410 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.182295 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prsgt\" (UniqueName: \"kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt\") pod \"manila-db-sync-vjdm5\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:55 crc kubenswrapper[5127]: I0201 08:53:55.264087 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-vjdm5" Feb 01 08:53:56 crc kubenswrapper[5127]: I0201 08:53:56.143893 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-vjdm5"] Feb 01 08:53:57 crc kubenswrapper[5127]: I0201 08:53:57.043940 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vjdm5" event={"ID":"5802b898-6911-41ee-911f-c63f88207d79","Type":"ContainerStarted","Data":"12e2becabe3e054f9d3074b5a6f98e0171874b68748bb8e79a5101402399ae5d"} Feb 01 08:53:59 crc kubenswrapper[5127]: I0201 08:53:59.825061 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:59 crc kubenswrapper[5127]: I0201 08:53:59.825401 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:53:59 crc kubenswrapper[5127]: I0201 08:53:59.873446 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:54:00 crc kubenswrapper[5127]: I0201 08:54:00.129167 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:54:01 crc kubenswrapper[5127]: I0201 08:54:01.241055 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.105578 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vpjlr" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="registry-server" containerID="cri-o://606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321" gracePeriod=2 Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.730348 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.837630 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content\") pod \"4d40d048-a7c0-4857-a12e-359ef9814e7c\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.837862 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t89f\" (UniqueName: \"kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f\") pod \"4d40d048-a7c0-4857-a12e-359ef9814e7c\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.837913 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities\") pod \"4d40d048-a7c0-4857-a12e-359ef9814e7c\" (UID: \"4d40d048-a7c0-4857-a12e-359ef9814e7c\") " Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.838878 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities" (OuterVolumeSpecName: "utilities") pod "4d40d048-a7c0-4857-a12e-359ef9814e7c" (UID: "4d40d048-a7c0-4857-a12e-359ef9814e7c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.850995 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f" (OuterVolumeSpecName: "kube-api-access-9t89f") pod "4d40d048-a7c0-4857-a12e-359ef9814e7c" (UID: "4d40d048-a7c0-4857-a12e-359ef9814e7c"). InnerVolumeSpecName "kube-api-access-9t89f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.906806 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d40d048-a7c0-4857-a12e-359ef9814e7c" (UID: "4d40d048-a7c0-4857-a12e-359ef9814e7c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.940455 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t89f\" (UniqueName: \"kubernetes.io/projected/4d40d048-a7c0-4857-a12e-359ef9814e7c-kube-api-access-9t89f\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.940495 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:02 crc kubenswrapper[5127]: I0201 08:54:02.940509 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d40d048-a7c0-4857-a12e-359ef9814e7c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.043730 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-5bzw7"] Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.057506 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-5bzw7"] Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.128775 5127 generic.go:334] "Generic (PLEG): container finished" podID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerID="606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321" exitCode=0 Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.128869 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vpjlr" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.128871 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerDied","Data":"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321"} Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.129003 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vpjlr" event={"ID":"4d40d048-a7c0-4857-a12e-359ef9814e7c","Type":"ContainerDied","Data":"0168bfffc6b3f9eb1c72ac3e5ef1557a179cc48b67ba4233b6867a29dac17a01"} Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.129027 5127 scope.go:117] "RemoveContainer" containerID="606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.131532 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vjdm5" event={"ID":"5802b898-6911-41ee-911f-c63f88207d79","Type":"ContainerStarted","Data":"41118b7e5a1f623763847f62d37dababa60c350faf1cb65916f3180f27d57dd6"} Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.162632 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-vjdm5" podStartSLOduration=3.727288786 podStartE2EDuration="9.162609332s" podCreationTimestamp="2026-02-01 08:53:54 +0000 UTC" firstStartedPulling="2026-02-01 08:53:56.145064178 +0000 UTC m=+7586.630966541" lastFinishedPulling="2026-02-01 08:54:01.580384694 +0000 UTC m=+7592.066287087" observedRunningTime="2026-02-01 08:54:03.156699604 +0000 UTC m=+7593.642601967" watchObservedRunningTime="2026-02-01 08:54:03.162609332 +0000 UTC m=+7593.648511705" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.171535 5127 scope.go:117] "RemoveContainer" containerID="8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.189983 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.197849 5127 scope.go:117] "RemoveContainer" containerID="309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.202377 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vpjlr"] Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.237491 5127 scope.go:117] "RemoveContainer" containerID="606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321" Feb 01 08:54:03 crc kubenswrapper[5127]: E0201 08:54:03.238011 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321\": container with ID starting with 606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321 not found: ID does not exist" containerID="606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.238049 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321"} err="failed to get container status \"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321\": rpc error: code = NotFound desc = could not 
find container \"606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321\": container with ID starting with 606a0af5e6bcf19cb1195d9debb348b17cb4b405203bf7aa26fc033cea62e321 not found: ID does not exist" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.238070 5127 scope.go:117] "RemoveContainer" containerID="8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3" Feb 01 08:54:03 crc kubenswrapper[5127]: E0201 08:54:03.238537 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3\": container with ID starting with 8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3 not found: ID does not exist" containerID="8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.238592 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3"} err="failed to get container status \"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3\": rpc error: code = NotFound desc = could not find container \"8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3\": container with ID starting with 8038e2361fb0c33a66b002883feac88b261a6d40d2dde11a0910df0ec2e5ffb3 not found: ID does not exist" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.238623 5127 scope.go:117] "RemoveContainer" containerID="309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929" Feb 01 08:54:03 crc kubenswrapper[5127]: E0201 08:54:03.238890 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929\": container with ID starting with 309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929 not found: ID does not exist" containerID="309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929" Feb 01 08:54:03 crc kubenswrapper[5127]: I0201 08:54:03.238910 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929"} err="failed to get container status \"309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929\": rpc error: code = NotFound desc = could not find container \"309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929\": container with ID starting with 309418d13e9d1d5510c87aa44b80760546273b89b51ee03e7c75f26f84a97929 not found: ID does not exist" Feb 01 08:54:04 crc kubenswrapper[5127]: I0201 08:54:04.143906 5127 generic.go:334] "Generic (PLEG): container finished" podID="5802b898-6911-41ee-911f-c63f88207d79" containerID="41118b7e5a1f623763847f62d37dababa60c350faf1cb65916f3180f27d57dd6" exitCode=0 Feb 01 08:54:04 crc kubenswrapper[5127]: I0201 08:54:04.144131 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vjdm5" event={"ID":"5802b898-6911-41ee-911f-c63f88207d79","Type":"ContainerDied","Data":"41118b7e5a1f623763847f62d37dababa60c350faf1cb65916f3180f27d57dd6"} Feb 01 08:54:04 crc kubenswrapper[5127]: I0201 08:54:04.255384 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" path="/var/lib/kubelet/pods/4d40d048-a7c0-4857-a12e-359ef9814e7c/volumes" Feb 01 08:54:04 crc kubenswrapper[5127]: I0201 
08:54:04.257427 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21d88c9-c7f4-49dc-ae5d-cdae33667b35" path="/var/lib/kubelet/pods/a21d88c9-c7f4-49dc-ae5d-cdae33667b35/volumes" Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.729492 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-vjdm5" Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.913088 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data\") pod \"5802b898-6911-41ee-911f-c63f88207d79\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.913162 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data\") pod \"5802b898-6911-41ee-911f-c63f88207d79\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.913351 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prsgt\" (UniqueName: \"kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt\") pod \"5802b898-6911-41ee-911f-c63f88207d79\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.913507 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle\") pod \"5802b898-6911-41ee-911f-c63f88207d79\" (UID: \"5802b898-6911-41ee-911f-c63f88207d79\") " Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.918668 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt" (OuterVolumeSpecName: "kube-api-access-prsgt") pod "5802b898-6911-41ee-911f-c63f88207d79" (UID: "5802b898-6911-41ee-911f-c63f88207d79"). InnerVolumeSpecName "kube-api-access-prsgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.919373 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "5802b898-6911-41ee-911f-c63f88207d79" (UID: "5802b898-6911-41ee-911f-c63f88207d79"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.921494 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data" (OuterVolumeSpecName: "config-data") pod "5802b898-6911-41ee-911f-c63f88207d79" (UID: "5802b898-6911-41ee-911f-c63f88207d79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:05 crc kubenswrapper[5127]: I0201 08:54:05.942312 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5802b898-6911-41ee-911f-c63f88207d79" (UID: "5802b898-6911-41ee-911f-c63f88207d79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.016172 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.016205 5127 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-job-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.016217 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5802b898-6911-41ee-911f-c63f88207d79-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.016226 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prsgt\" (UniqueName: \"kubernetes.io/projected/5802b898-6911-41ee-911f-c63f88207d79-kube-api-access-prsgt\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.171878 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-vjdm5" event={"ID":"5802b898-6911-41ee-911f-c63f88207d79","Type":"ContainerDied","Data":"12e2becabe3e054f9d3074b5a6f98e0171874b68748bb8e79a5101402399ae5d"} Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.171926 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e2becabe3e054f9d3074b5a6f98e0171874b68748bb8e79a5101402399ae5d" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.171992 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-vjdm5" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.671439 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: E0201 08:54:06.672004 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="extract-content" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672028 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="extract-content" Feb 01 08:54:06 crc kubenswrapper[5127]: E0201 08:54:06.672046 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="registry-server" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672056 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="registry-server" Feb 01 08:54:06 crc kubenswrapper[5127]: E0201 08:54:06.672099 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5802b898-6911-41ee-911f-c63f88207d79" containerName="manila-db-sync" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672108 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5802b898-6911-41ee-911f-c63f88207d79" containerName="manila-db-sync" Feb 01 08:54:06 crc kubenswrapper[5127]: E0201 08:54:06.672127 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="extract-utilities" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672136 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" 
containerName="extract-utilities" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672416 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5802b898-6911-41ee-911f-c63f88207d79" containerName="manila-db-sync" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.672458 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d40d048-a7c0-4857-a12e-359ef9814e7c" containerName="registry-server" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.675330 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.679421 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.679743 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.680073 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.680494 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-rc22r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.686479 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.697616 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.699554 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.707924 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.708319 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.825621 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.827779 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.835854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.835919 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-scripts\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.835965 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.835991 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-ceph\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836055 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836074 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836091 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836124 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" 
Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836138 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836154 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836194 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff96x\" (UniqueName: \"kubernetes.io/projected/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-kube-api-access-ff96x\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836230 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8cf\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-kube-api-access-xg8cf\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.836262 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-scripts\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.840899 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.938528 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8cf\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-kube-api-access-xg8cf\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.938993 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939027 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-scripts\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939094 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data-custom\") pod 
\"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939137 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-scripts\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939167 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939214 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939247 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939293 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-ceph\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939314 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939339 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z2rd\" (UniqueName: \"kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939365 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939389 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 
08:54:06.939415 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939453 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939473 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939496 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939537 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.939563 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff96x\" (UniqueName: \"kubernetes.io/projected/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-kube-api-access-ff96x\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.943112 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.949009 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-scripts\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.949023 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.949041 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/63c4b3fc-9fd2-4bee-af21-71a7616cf171-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.949453 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.949467 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.950029 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.950474 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.951862 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.951898 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-ceph\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.955396 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.959748 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.963638 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.966519 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8cf\" (UniqueName: \"kubernetes.io/projected/63c4b3fc-9fd2-4bee-af21-71a7616cf171-kube-api-access-xg8cf\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.966841 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff96x\" (UniqueName: \"kubernetes.io/projected/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-kube-api-access-ff96x\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.966918 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7d8066ea-58ff-4b2a-84ac-164dcf7197ff-config-data\") pod \"manila-scheduler-0\" (UID: \"7d8066ea-58ff-4b2a-84ac-164dcf7197ff\") " pod="openstack/manila-scheduler-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.967561 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:06 crc kubenswrapper[5127]: I0201 08:54:06.972237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63c4b3fc-9fd2-4bee-af21-71a7616cf171-scripts\") pod \"manila-share-share1-0\" (UID: \"63c4b3fc-9fd2-4bee-af21-71a7616cf171\") " pod="openstack/manila-share-share1-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.002283 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.028305 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.041637 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.041726 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.041830 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.041908 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z2rd\" (UniqueName: \"kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.041935 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.042786 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " 
pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.043635 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.044046 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.044459 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.064167 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z2rd\" (UniqueName: \"kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd\") pod \"dnsmasq-dns-684dd6949-dp77r\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143483 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143504 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtz6p\" (UniqueName: \"kubernetes.io/projected/60a3e644-23c5-4c50-8a26-24c70e5701f1-kube-api-access-wtz6p\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143529 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a3e644-23c5-4c50-8a26-24c70e5701f1-logs\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143616 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a3e644-23c5-4c50-8a26-24c70e5701f1-etc-machine-id\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143649 5127 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-scripts\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.143666 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data-custom\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.162704 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.238036 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:54:07 crc kubenswrapper[5127]: E0201 08:54:07.238972 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245468 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245510 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245530 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtz6p\" (UniqueName: \"kubernetes.io/projected/60a3e644-23c5-4c50-8a26-24c70e5701f1-kube-api-access-wtz6p\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245560 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a3e644-23c5-4c50-8a26-24c70e5701f1-logs\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245639 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a3e644-23c5-4c50-8a26-24c70e5701f1-etc-machine-id\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-scripts\") pod \"manila-api-0\" (UID: 
\"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.245685 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data-custom\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.250716 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/60a3e644-23c5-4c50-8a26-24c70e5701f1-etc-machine-id\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.251506 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.252693 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60a3e644-23c5-4c50-8a26-24c70e5701f1-logs\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.254424 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.255784 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-scripts\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.260215 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/60a3e644-23c5-4c50-8a26-24c70e5701f1-config-data-custom\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.265382 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtz6p\" (UniqueName: \"kubernetes.io/projected/60a3e644-23c5-4c50-8a26-24c70e5701f1-kube-api-access-wtz6p\") pod \"manila-api-0\" (UID: \"60a3e644-23c5-4c50-8a26-24c70e5701f1\") " pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.473769 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.770438 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.810483 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:54:07 crc kubenswrapper[5127]: W0201 08:54:07.810695 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d1088e_bbd9_4106_8769_d701862471c0.slice/crio-b95f7838bf6022c7de5ed666f8e648d374c37e3680726efba1cba7fc20fa7068 WatchSource:0}: Error finding container b95f7838bf6022c7de5ed666f8e648d374c37e3680726efba1cba7fc20fa7068: Status 404 returned error can't find the container with id b95f7838bf6022c7de5ed666f8e648d374c37e3680726efba1cba7fc20fa7068 Feb 01 08:54:07 crc kubenswrapper[5127]: I0201 08:54:07.832686 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 01 08:54:07 crc kubenswrapper[5127]: W0201 08:54:07.835815 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d8066ea_58ff_4b2a_84ac_164dcf7197ff.slice/crio-e386526cd2dfc1b1a25822b6aeeb8bb1e1a1261d4db9ab96ffd044d152ec2a33 WatchSource:0}: Error finding container e386526cd2dfc1b1a25822b6aeeb8bb1e1a1261d4db9ab96ffd044d152ec2a33: Status 404 returned error can't find the container with id e386526cd2dfc1b1a25822b6aeeb8bb1e1a1261d4db9ab96ffd044d152ec2a33 Feb 01 08:54:08 crc kubenswrapper[5127]: I0201 08:54:08.175928 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 01 08:54:08 crc kubenswrapper[5127]: I0201 08:54:08.208961 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7d8066ea-58ff-4b2a-84ac-164dcf7197ff","Type":"ContainerStarted","Data":"e386526cd2dfc1b1a25822b6aeeb8bb1e1a1261d4db9ab96ffd044d152ec2a33"} Feb 01 08:54:08 crc kubenswrapper[5127]: I0201 08:54:08.212924 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerStarted","Data":"4d515fb69e709ba445f842c483e5b2777d61d8d875886fe48215265000904fa1"} Feb 01 08:54:08 crc kubenswrapper[5127]: I0201 08:54:08.212986 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerStarted","Data":"b95f7838bf6022c7de5ed666f8e648d374c37e3680726efba1cba7fc20fa7068"} Feb 01 08:54:08 crc kubenswrapper[5127]: I0201 08:54:08.223913 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"63c4b3fc-9fd2-4bee-af21-71a7616cf171","Type":"ContainerStarted","Data":"a855607b246d31d8ea0b4e7002c045d75936cf10b4c0b54cad34852e0f850d06"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.250061 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"60a3e644-23c5-4c50-8a26-24c70e5701f1","Type":"ContainerStarted","Data":"6e75506b9cf9c53592211ba07db053abce592d27a737d570d9cfcf1a33584c66"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.250675 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"60a3e644-23c5-4c50-8a26-24c70e5701f1","Type":"ContainerStarted","Data":"254d1be489c1dcfce9ebd972f2ee54eea39300978d0b6d2a9408d7449ec4c272"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.253277 5127 generic.go:334] "Generic (PLEG): container finished" podID="43d1088e-bbd9-4106-8769-d701862471c0" containerID="4d515fb69e709ba445f842c483e5b2777d61d8d875886fe48215265000904fa1" exitCode=0 Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.253331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerDied","Data":"4d515fb69e709ba445f842c483e5b2777d61d8d875886fe48215265000904fa1"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.253351 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerStarted","Data":"7289210a878fdcf0a0d48ffc87d8cf0d2ae0998f431487dbcf98242ef30ab963"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.253455 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.258649 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7d8066ea-58ff-4b2a-84ac-164dcf7197ff","Type":"ContainerStarted","Data":"03c0629cdfb1d98779f250961720967ba4dd2976929a327f163c6a615835816e"} Feb 01 08:54:09 crc kubenswrapper[5127]: I0201 08:54:09.274574 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-684dd6949-dp77r" podStartSLOduration=3.274555147 podStartE2EDuration="3.274555147s" podCreationTimestamp="2026-02-01 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:54:09.27281655 +0000 UTC m=+7599.758718933" watchObservedRunningTime="2026-02-01 08:54:09.274555147 +0000 UTC m=+7599.760457510" Feb 01 08:54:10 crc kubenswrapper[5127]: I0201 08:54:10.311498 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"7d8066ea-58ff-4b2a-84ac-164dcf7197ff","Type":"ContainerStarted","Data":"94b2495354d3b5ee78b466eebf8eb27a47ed643bb0253b6ae4618970887a5461"} Feb 01 08:54:10 crc kubenswrapper[5127]: I0201 08:54:10.314018 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"60a3e644-23c5-4c50-8a26-24c70e5701f1","Type":"ContainerStarted","Data":"92a347e31cb7480cc707cec1089ca6438732fc1648ae588be863d8de8fe9180e"} Feb 01 08:54:10 crc kubenswrapper[5127]: I0201 08:54:10.335316 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.793711864 podStartE2EDuration="4.335297334s" podCreationTimestamp="2026-02-01 08:54:06 +0000 UTC" firstStartedPulling="2026-02-01 08:54:07.838798223 +0000 UTC m=+7598.324700596" lastFinishedPulling="2026-02-01 08:54:08.380383703 +0000 UTC m=+7598.866286066" observedRunningTime="2026-02-01 08:54:10.330921807 +0000 UTC m=+7600.816824180" watchObservedRunningTime="2026-02-01 08:54:10.335297334 +0000 UTC m=+7600.821199697" Feb 01 08:54:10 crc kubenswrapper[5127]: I0201 08:54:10.352411 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.352390693 podStartE2EDuration="4.352390693s" 
podCreationTimestamp="2026-02-01 08:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:54:10.349088695 +0000 UTC m=+7600.834991058" watchObservedRunningTime="2026-02-01 08:54:10.352390693 +0000 UTC m=+7600.838293056" Feb 01 08:54:11 crc kubenswrapper[5127]: I0201 08:54:11.332062 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 01 08:54:15 crc kubenswrapper[5127]: I0201 08:54:15.395703 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"63c4b3fc-9fd2-4bee-af21-71a7616cf171","Type":"ContainerStarted","Data":"aecd4b0fd4c9a79af6afadd53b4302f5afdf2d186a7d9859ae596b8effcf1d53"} Feb 01 08:54:16 crc kubenswrapper[5127]: I0201 08:54:16.409837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"63c4b3fc-9fd2-4bee-af21-71a7616cf171","Type":"ContainerStarted","Data":"8f53925b458d466488a43054e3e7e9f482e65c6c3019ce0b73ef3a476655a36e"} Feb 01 08:54:16 crc kubenswrapper[5127]: I0201 08:54:16.444527 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.441801791 podStartE2EDuration="10.444502666s" podCreationTimestamp="2026-02-01 08:54:06 +0000 UTC" firstStartedPulling="2026-02-01 08:54:07.771517846 +0000 UTC m=+7598.257420209" lastFinishedPulling="2026-02-01 08:54:14.774218721 +0000 UTC m=+7605.260121084" observedRunningTime="2026-02-01 08:54:16.43646742 +0000 UTC m=+7606.922369833" watchObservedRunningTime="2026-02-01 08:54:16.444502666 +0000 UTC m=+7606.930405039" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.002645 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.030110 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.164875 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.253395 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.262750 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.263018 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="dnsmasq-dns" containerID="cri-o://cc584ee8cdc0b72db97d721bfb579c2037f5ef64aa41be2f9fa027c723163f7b" gracePeriod=10 Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.427319 5127 generic.go:334] "Generic (PLEG): container finished" podID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerID="cc584ee8cdc0b72db97d721bfb579c2037f5ef64aa41be2f9fa027c723163f7b" exitCode=0 Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.427421 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" event={"ID":"c539054e-2748-4c21-ab77-e8720cbc02bf","Type":"ContainerDied","Data":"cc584ee8cdc0b72db97d721bfb579c2037f5ef64aa41be2f9fa027c723163f7b"} Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.847413 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5"
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.881440 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb\") pod \"c539054e-2748-4c21-ab77-e8720cbc02bf\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") "
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.881660 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config\") pod \"c539054e-2748-4c21-ab77-e8720cbc02bf\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") "
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.881699 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb\") pod \"c539054e-2748-4c21-ab77-e8720cbc02bf\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") "
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.881735 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7jbq\" (UniqueName: \"kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq\") pod \"c539054e-2748-4c21-ab77-e8720cbc02bf\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") "
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.881849 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc\") pod \"c539054e-2748-4c21-ab77-e8720cbc02bf\" (UID: \"c539054e-2748-4c21-ab77-e8720cbc02bf\") "
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.893880 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq" (OuterVolumeSpecName: "kube-api-access-v7jbq") pod "c539054e-2748-4c21-ab77-e8720cbc02bf" (UID: "c539054e-2748-4c21-ab77-e8720cbc02bf"). InnerVolumeSpecName "kube-api-access-v7jbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.942612 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c539054e-2748-4c21-ab77-e8720cbc02bf" (UID: "c539054e-2748-4c21-ab77-e8720cbc02bf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.966475 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c539054e-2748-4c21-ab77-e8720cbc02bf" (UID: "c539054e-2748-4c21-ab77-e8720cbc02bf"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.968470 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c539054e-2748-4c21-ab77-e8720cbc02bf" (UID: "c539054e-2748-4c21-ab77-e8720cbc02bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.978891 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config" (OuterVolumeSpecName: "config") pod "c539054e-2748-4c21-ab77-e8720cbc02bf" (UID: "c539054e-2748-4c21-ab77-e8720cbc02bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.987489 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.987525 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.987538 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7jbq\" (UniqueName: \"kubernetes.io/projected/c539054e-2748-4c21-ab77-e8720cbc02bf-kube-api-access-v7jbq\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.987548 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:17 crc kubenswrapper[5127]: I0201 08:54:17.987556 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c539054e-2748-4c21-ab77-e8720cbc02bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.442024 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" event={"ID":"c539054e-2748-4c21-ab77-e8720cbc02bf","Type":"ContainerDied","Data":"fed838cd03164e50dac93e2c13e5a78f1c395d9145ea07e08161acf0361d0188"} Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.442529 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f67595cb7-ppcf5" Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.442862 5127 scope.go:117] "RemoveContainer" containerID="cc584ee8cdc0b72db97d721bfb579c2037f5ef64aa41be2f9fa027c723163f7b" Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.475002 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.479072 5127 scope.go:117] "RemoveContainer" containerID="c819be784ccfaf353fcd5d1ffea9892db74d63e90b791956c589d37f8c4b2faa" Feb 01 08:54:18 crc kubenswrapper[5127]: I0201 08:54:18.490171 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f67595cb7-ppcf5"] Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.235440 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:54:19 crc kubenswrapper[5127]: E0201 08:54:19.236083 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.473642 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.474433 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-central-agent" containerID="cri-o://5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc" gracePeriod=30 Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.475048 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="proxy-httpd" containerID="cri-o://c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045" gracePeriod=30 Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.475136 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-notification-agent" containerID="cri-o://42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d" gracePeriod=30 Feb 01 08:54:19 crc kubenswrapper[5127]: I0201 08:54:19.475273 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="sg-core" containerID="cri-o://a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc" gracePeriod=30 Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.252906 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" path="/var/lib/kubelet/pods/c539054e-2748-4c21-ab77-e8720cbc02bf/volumes" Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.465744 5127 generic.go:334] "Generic (PLEG): container finished" podID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerID="c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045" exitCode=0 Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.465941 5127 
generic.go:334] "Generic (PLEG): container finished" podID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerID="a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc" exitCode=2 Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.466002 5127 generic.go:334] "Generic (PLEG): container finished" podID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerID="5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc" exitCode=0 Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.465833 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerDied","Data":"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045"} Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.466159 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerDied","Data":"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc"} Feb 01 08:54:20 crc kubenswrapper[5127]: I0201 08:54:20.466252 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerDied","Data":"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc"} Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.101937 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150686 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150734 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150757 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150862 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jdx9\" (UniqueName: \"kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150887 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150910 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml\") pod 
\"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.150936 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts\") pod \"68872d9f-9142-4dcd-8a9a-2294500a1f36\" (UID: \"68872d9f-9142-4dcd-8a9a-2294500a1f36\") " Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.159911 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.159840 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts" (OuterVolumeSpecName: "scripts") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.160371 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.165608 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9" (OuterVolumeSpecName: "kube-api-access-9jdx9") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "kube-api-access-9jdx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.192742 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.253959 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jdx9\" (UniqueName: \"kubernetes.io/projected/68872d9f-9142-4dcd-8a9a-2294500a1f36-kube-api-access-9jdx9\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.253989 5127 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.254002 5127 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.254011 5127 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-scripts\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.254020 5127 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68872d9f-9142-4dcd-8a9a-2294500a1f36-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.255465 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.263252 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data" (OuterVolumeSpecName: "config-data") pod "68872d9f-9142-4dcd-8a9a-2294500a1f36" (UID: "68872d9f-9142-4dcd-8a9a-2294500a1f36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.356500 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.356541 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68872d9f-9142-4dcd-8a9a-2294500a1f36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.519040 5127 generic.go:334] "Generic (PLEG): container finished" podID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerID="42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d" exitCode=0 Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.519088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerDied","Data":"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d"} Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.519150 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68872d9f-9142-4dcd-8a9a-2294500a1f36","Type":"ContainerDied","Data":"5b0a63b03c0c142507dd13ad1abf4630ef65ca0c86ed9e11419f5d8d567e2fcd"} Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.519153 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.519181 5127 scope.go:117] "RemoveContainer" containerID="c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.549754 5127 scope.go:117] "RemoveContainer" containerID="a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.578332 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.601191 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.601847 5127 scope.go:117] "RemoveContainer" containerID="42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610135 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610598 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="dnsmasq-dns" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610620 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="dnsmasq-dns" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610632 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-central-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610639 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-central-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610654 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="sg-core" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610660 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="sg-core" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610676 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="init" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610682 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="init" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610711 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="proxy-httpd" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610718 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="proxy-httpd" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.610729 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-notification-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610734 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-notification-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610916 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c539054e-2748-4c21-ab77-e8720cbc02bf" containerName="dnsmasq-dns" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610932 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="sg-core" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610948 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-notification-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610957 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="proxy-httpd" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.610967 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" containerName="ceilometer-central-agent" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.612805 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.616454 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.616693 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.632778 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.664597 5127 scope.go:117] "RemoveContainer" containerID="5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-log-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665358 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-scripts\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665489 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665606 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-run-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665654 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-config-data\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665828 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.665997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkt8w\" (UniqueName: \"kubernetes.io/projected/0b56c932-5925-4fd1-b889-86e2d62a41ec-kube-api-access-mkt8w\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.694224 5127 scope.go:117] "RemoveContainer" containerID="c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 
08:54:25.694686 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045\": container with ID starting with c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045 not found: ID does not exist" containerID="c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.694728 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045"} err="failed to get container status \"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045\": rpc error: code = NotFound desc = could not find container \"c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045\": container with ID starting with c3f3073caea84619b9b2b92b8cbb13e5db5bcd03e15fa98c8139b2ec3726d045 not found: ID does not exist" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.694756 5127 scope.go:117] "RemoveContainer" containerID="a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.694993 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc\": container with ID starting with a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc not found: ID does not exist" containerID="a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.695017 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc"} err="failed to get container status \"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc\": rpc error: code = NotFound desc = could not find container \"a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc\": container with ID starting with a7445eabc3eb309874292a3b833f45fc62efaf6dabe091eadfdacb94940c54cc not found: ID does not exist" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.695067 5127 scope.go:117] "RemoveContainer" containerID="42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.697948 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d\": container with ID starting with 42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d not found: ID does not exist" containerID="42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.697986 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d"} err="failed to get container status \"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d\": rpc error: code = NotFound desc = could not find container \"42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d\": container with ID starting with 42ac83b9981ca58850b80213b1fdbbcf358e71e91169a94ae5158390c1d7460d not found: ID does not exist" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.698004 5127 
scope.go:117] "RemoveContainer" containerID="5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc" Feb 01 08:54:25 crc kubenswrapper[5127]: E0201 08:54:25.698206 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc\": container with ID starting with 5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc not found: ID does not exist" containerID="5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.698233 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc"} err="failed to get container status \"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc\": rpc error: code = NotFound desc = could not find container \"5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc\": container with ID starting with 5f3a82acd377d8f81aa49859e96d66f62ad17fdb499b1833e0318331bbd8a8dc not found: ID does not exist" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767793 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-run-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767840 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-config-data\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767883 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767935 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkt8w\" (UniqueName: \"kubernetes.io/projected/0b56c932-5925-4fd1-b889-86e2d62a41ec-kube-api-access-mkt8w\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767978 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-log-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.767998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-scripts\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.768085 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.768327 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-run-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.768682 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0b56c932-5925-4fd1-b889-86e2d62a41ec-log-httpd\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.773381 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-config-data\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.774497 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.775194 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.775486 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b56c932-5925-4fd1-b889-86e2d62a41ec-scripts\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.788762 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkt8w\" (UniqueName: \"kubernetes.io/projected/0b56c932-5925-4fd1-b889-86e2d62a41ec-kube-api-access-mkt8w\") pod \"ceilometer-0\" (UID: \"0b56c932-5925-4fd1-b889-86e2d62a41ec\") " pod="openstack/ceilometer-0" Feb 01 08:54:25 crc kubenswrapper[5127]: I0201 08:54:25.957758 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 01 08:54:26 crc kubenswrapper[5127]: I0201 08:54:26.248035 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68872d9f-9142-4dcd-8a9a-2294500a1f36" path="/var/lib/kubelet/pods/68872d9f-9142-4dcd-8a9a-2294500a1f36/volumes" Feb 01 08:54:26 crc kubenswrapper[5127]: I0201 08:54:26.488734 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 01 08:54:26 crc kubenswrapper[5127]: W0201 08:54:26.491775 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b56c932_5925_4fd1_b889_86e2d62a41ec.slice/crio-227cbf111489d1ee52b8e7a0a9b9e11f2490ed4c8521d00ff8e5e8dedf382c2a WatchSource:0}: Error finding container 227cbf111489d1ee52b8e7a0a9b9e11f2490ed4c8521d00ff8e5e8dedf382c2a: Status 404 returned error can't find the container with id 227cbf111489d1ee52b8e7a0a9b9e11f2490ed4c8521d00ff8e5e8dedf382c2a Feb 01 08:54:26 crc kubenswrapper[5127]: I0201 08:54:26.495164 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 08:54:26 crc kubenswrapper[5127]: I0201 08:54:26.544965 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b56c932-5925-4fd1-b889-86e2d62a41ec","Type":"ContainerStarted","Data":"227cbf111489d1ee52b8e7a0a9b9e11f2490ed4c8521d00ff8e5e8dedf382c2a"} Feb 01 08:54:27 crc kubenswrapper[5127]: I0201 08:54:27.560979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b56c932-5925-4fd1-b889-86e2d62a41ec","Type":"ContainerStarted","Data":"32d82add338b8aa903353eb2210dd4de69b711d7b762a0b924bd06b6b332a142"} Feb 01 08:54:27 crc kubenswrapper[5127]: I0201 08:54:27.561619 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b56c932-5925-4fd1-b889-86e2d62a41ec","Type":"ContainerStarted","Data":"d2e9210a00207d6610fcdf4637de6dfa9a177fe3071d70f8305e67d1f2658763"} Feb 01 08:54:28 crc kubenswrapper[5127]: I0201 08:54:28.547228 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Feb 01 08:54:28 crc kubenswrapper[5127]: I0201 08:54:28.576190 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b56c932-5925-4fd1-b889-86e2d62a41ec","Type":"ContainerStarted","Data":"45971c0467bf7f5d0ec04fb41e0b051687d03037b847897d452dc53433e3e093"} Feb 01 08:54:28 crc kubenswrapper[5127]: I0201 08:54:28.672401 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Feb 01 08:54:28 crc kubenswrapper[5127]: I0201 08:54:28.935649 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Feb 01 08:54:30 crc kubenswrapper[5127]: I0201 08:54:30.597765 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0b56c932-5925-4fd1-b889-86e2d62a41ec","Type":"ContainerStarted","Data":"4fb71ba7a64b74076809c1e2d503d04d81de0590306ca55711d83e731f653e11"} Feb 01 08:54:30 crc kubenswrapper[5127]: I0201 08:54:30.598294 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 01 08:54:30 crc kubenswrapper[5127]: I0201 08:54:30.626035 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.833677494 
podStartE2EDuration="5.626018078s" podCreationTimestamp="2026-02-01 08:54:25 +0000 UTC" firstStartedPulling="2026-02-01 08:54:26.494864681 +0000 UTC m=+7616.980767044" lastFinishedPulling="2026-02-01 08:54:30.287205265 +0000 UTC m=+7620.773107628" observedRunningTime="2026-02-01 08:54:30.617170321 +0000 UTC m=+7621.103072684" watchObservedRunningTime="2026-02-01 08:54:30.626018078 +0000 UTC m=+7621.111920441" Feb 01 08:54:34 crc kubenswrapper[5127]: I0201 08:54:34.236263 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:54:34 crc kubenswrapper[5127]: E0201 08:54:34.237264 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:54:46 crc kubenswrapper[5127]: I0201 08:54:46.581189 5127 scope.go:117] "RemoveContainer" containerID="7f147dda7528bb3dc8be11d16730d307b338c9a8f59c256dc7410d4e7b5ff399" Feb 01 08:54:46 crc kubenswrapper[5127]: I0201 08:54:46.616181 5127 scope.go:117] "RemoveContainer" containerID="396befc3a955dc91cf4ddc074ef30fd9ae7cfc913425f90feb960d086eec57ab" Feb 01 08:54:46 crc kubenswrapper[5127]: I0201 08:54:46.698925 5127 scope.go:117] "RemoveContainer" containerID="38f8d0d67d6df968d5c9947e6389b73c5bf01d304792008815da09ba7bd2d0fb" Feb 01 08:54:48 crc kubenswrapper[5127]: I0201 08:54:48.239769 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:54:48 crc kubenswrapper[5127]: E0201 08:54:48.240441 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:54:55 crc kubenswrapper[5127]: I0201 08:54:55.965998 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 01 08:55:00 crc kubenswrapper[5127]: I0201 08:55:00.244634 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:55:00 crc kubenswrapper[5127]: E0201 08:55:00.261270 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.056727 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-lgdpw"] Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.070782 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-2xclr"] Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.085834 5127 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-cell0-db-create-7s9tm"] Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.095692 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7s9tm"] Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.107655 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-lgdpw"] Feb 01 08:55:03 crc kubenswrapper[5127]: I0201 08:55:03.118371 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-2xclr"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.143804 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2cf8-account-create-update-qhmcg"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.163149 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2cf8-account-create-update-qhmcg"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.273425 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14185d1-b786-48b9-ad05-165029edace5" path="/var/lib/kubelet/pods/a14185d1-b786-48b9-ad05-165029edace5/volumes" Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.274159 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05ff55a-0cef-4b41-aa86-84f55b33de4c" path="/var/lib/kubelet/pods/e05ff55a-0cef-4b41-aa86-84f55b33de4c/volumes" Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.274825 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6f0fdf1-acde-4ef1-af94-afed7e87232c" path="/var/lib/kubelet/pods/e6f0fdf1-acde-4ef1-af94-afed7e87232c/volumes" Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.275469 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8550964-0e27-4d3e-a09c-3cad9587955f" path="/var/lib/kubelet/pods/f8550964-0e27-4d3e-a09c-3cad9587955f/volumes" Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.277836 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6cad-account-create-update-kbxdv"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.330610 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6cad-account-create-update-kbxdv"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.362057 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-dc19-account-create-update-4lrx5"] Feb 01 08:55:04 crc kubenswrapper[5127]: I0201 08:55:04.370802 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-dc19-account-create-update-4lrx5"] Feb 01 08:55:06 crc kubenswrapper[5127]: I0201 08:55:06.265182 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2928aa86-c015-4379-b5d5-2254d8ca6989" path="/var/lib/kubelet/pods/2928aa86-c015-4379-b5d5-2254d8ca6989/volumes" Feb 01 08:55:06 crc kubenswrapper[5127]: I0201 08:55:06.267104 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a" path="/var/lib/kubelet/pods/ea8879f4-10bb-47f7-a83e-d35d8c6b9f8a/volumes" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.095198 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.097973 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.101729 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.115174 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.235757 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:55:14 crc kubenswrapper[5127]: E0201 08:55:14.236107 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252187 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252245 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmm8h\" (UniqueName: \"kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252406 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252473 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252536 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.252753 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 
08:55:14.354616 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.354825 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.354849 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmm8h\" (UniqueName: \"kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.354881 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.354912 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.354937 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.355970 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.355987 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.356208 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.356304 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.356652 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.376445 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmm8h\" (UniqueName: \"kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h\") pod \"dnsmasq-dns-5c776fcfb5-57wwt\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.420480 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:14 crc kubenswrapper[5127]: I0201 08:55:14.927771 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:15 crc kubenswrapper[5127]: I0201 08:55:15.140737 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" event={"ID":"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4","Type":"ContainerStarted","Data":"2501b1be6056f5fa3bb31fcaed3cc1c571765b6bef7ff9745e96f04988cb72fd"} Feb 01 08:55:16 crc kubenswrapper[5127]: I0201 08:55:16.152199 5127 generic.go:334] "Generic (PLEG): container finished" podID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerID="32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1" exitCode=0 Feb 01 08:55:16 crc kubenswrapper[5127]: I0201 08:55:16.152303 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" event={"ID":"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4","Type":"ContainerDied","Data":"32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1"} Feb 01 08:55:17 crc kubenswrapper[5127]: I0201 08:55:17.167221 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" event={"ID":"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4","Type":"ContainerStarted","Data":"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6"} Feb 01 08:55:17 crc kubenswrapper[5127]: I0201 08:55:17.167743 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:17 crc kubenswrapper[5127]: I0201 08:55:17.194136 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" podStartSLOduration=3.194107336 podStartE2EDuration="3.194107336s" podCreationTimestamp="2026-02-01 08:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:55:17.192464191 +0000 UTC m=+7667.678366624" watchObservedRunningTime="2026-02-01 08:55:17.194107336 +0000 UTC m=+7667.680009709" Feb 01 08:55:22 crc kubenswrapper[5127]: I0201 08:55:22.041419 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvj6t"] Feb 01 08:55:22 crc kubenswrapper[5127]: I0201 08:55:22.057975 5127 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvj6t"] Feb 01 08:55:22 crc kubenswrapper[5127]: I0201 08:55:22.269181 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f3fda8-50b7-40be-9546-3d0c11bb1896" path="/var/lib/kubelet/pods/f1f3fda8-50b7-40be-9546-3d0c11bb1896/volumes" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.421864 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.490096 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.490384 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-684dd6949-dp77r" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="dnsmasq-dns" containerID="cri-o://7289210a878fdcf0a0d48ffc87d8cf0d2ae0998f431487dbcf98242ef30ab963" gracePeriod=10 Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.817332 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86b4c6959-h82gd"] Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.820051 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.826033 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.854149 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86b4c6959-h82gd"] Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.927691 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.927854 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.927904 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.927936 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.927967 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.928007 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r44m7\" (UniqueName: \"kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.928056 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:24 crc kubenswrapper[5127]: I0201 08:55:24.987309 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b4c6959-h82gd"] Feb 01 08:55:24 crc kubenswrapper[5127]: E0201 08:55:24.988123 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-r44m7 openstack-cell1 openstack-networker ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-86b4c6959-h82gd" podUID="e65f8db9-c312-4dac-a18c-50684c4a48d6" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037348 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037490 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037523 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037551 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037630 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc 
kubenswrapper[5127]: I0201 08:55:25.037683 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r44m7\" (UniqueName: \"kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.037749 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.038732 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.039564 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.039562 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.044232 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f4c47df59-sgbvx"] Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.046152 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.053073 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.053830 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.054418 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.077081 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4c47df59-sgbvx"] Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.096430 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r44m7\" (UniqueName: \"kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7\") pod \"dnsmasq-dns-86b4c6959-h82gd\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.140780 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-cell1\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141141 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-networker\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141230 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-config\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141250 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141268 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-dns-svc\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141294 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4jg\" (UniqueName: \"kubernetes.io/projected/357483ec-5a75-4748-b811-c05f14bf9753-kube-api-access-2b4jg\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.141345 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.242928 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-cell1\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243019 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-networker\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243093 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-config\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243109 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243128 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-dns-svc\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243153 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4jg\" (UniqueName: \"kubernetes.io/projected/357483ec-5a75-4748-b811-c05f14bf9753-kube-api-access-2b4jg\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.243203 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.244131 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.244360 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-cell1\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.244478 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-config\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.244799 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-openstack-networker\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.244994 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-dns-svc\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.245273 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/357483ec-5a75-4748-b811-c05f14bf9753-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.282715 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4jg\" (UniqueName: \"kubernetes.io/projected/357483ec-5a75-4748-b811-c05f14bf9753-kube-api-access-2b4jg\") pod \"dnsmasq-dns-5f4c47df59-sgbvx\" (UID: \"357483ec-5a75-4748-b811-c05f14bf9753\") " pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.318547 5127 generic.go:334] "Generic (PLEG): container finished" podID="43d1088e-bbd9-4106-8769-d701862471c0" containerID="7289210a878fdcf0a0d48ffc87d8cf0d2ae0998f431487dbcf98242ef30ab963" exitCode=0 Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.318676 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.318954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerDied","Data":"7289210a878fdcf0a0d48ffc87d8cf0d2ae0998f431487dbcf98242ef30ab963"} Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.366190 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.373055 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446574 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb\") pod \"43d1088e-bbd9-4106-8769-d701862471c0\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446681 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446714 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb\") pod \"43d1088e-bbd9-4106-8769-d701862471c0\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446778 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446841 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z2rd\" (UniqueName: \"kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd\") pod \"43d1088e-bbd9-4106-8769-d701862471c0\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446864 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config\") pod \"43d1088e-bbd9-4106-8769-d701862471c0\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446931 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.446972 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: 
\"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447072 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r44m7\" (UniqueName: \"kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447130 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447215 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc\") pod \"43d1088e-bbd9-4106-8769-d701862471c0\" (UID: \"43d1088e-bbd9-4106-8769-d701862471c0\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447289 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb\") pod \"e65f8db9-c312-4dac-a18c-50684c4a48d6\" (UID: \"e65f8db9-c312-4dac-a18c-50684c4a48d6\") " Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447507 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447787 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447956 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.447968 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.448399 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.448607 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.449106 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker" (OuterVolumeSpecName: "openstack-networker") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "openstack-networker". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.449314 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config" (OuterVolumeSpecName: "config") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.458370 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7" (OuterVolumeSpecName: "kube-api-access-r44m7") pod "e65f8db9-c312-4dac-a18c-50684c4a48d6" (UID: "e65f8db9-c312-4dac-a18c-50684c4a48d6"). InnerVolumeSpecName "kube-api-access-r44m7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.458837 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd" (OuterVolumeSpecName: "kube-api-access-8z2rd") pod "43d1088e-bbd9-4106-8769-d701862471c0" (UID: "43d1088e-bbd9-4106-8769-d701862471c0"). InnerVolumeSpecName "kube-api-access-8z2rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.470501 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.513108 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "43d1088e-bbd9-4106-8769-d701862471c0" (UID: "43d1088e-bbd9-4106-8769-d701862471c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.515303 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "43d1088e-bbd9-4106-8769-d701862471c0" (UID: "43d1088e-bbd9-4106-8769-d701862471c0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.530173 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "43d1088e-bbd9-4106-8769-d701862471c0" (UID: "43d1088e-bbd9-4106-8769-d701862471c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.539831 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config" (OuterVolumeSpecName: "config") pod "43d1088e-bbd9-4106-8769-d701862471c0" (UID: "43d1088e-bbd9-4106-8769-d701862471c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.549908 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.549967 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.549984 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r44m7\" (UniqueName: \"kubernetes.io/projected/e65f8db9-c312-4dac-a18c-50684c4a48d6-kube-api-access-r44m7\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.549997 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550009 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550021 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e65f8db9-c312-4dac-a18c-50684c4a48d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550033 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550045 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550057 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z2rd\" (UniqueName: \"kubernetes.io/projected/43d1088e-bbd9-4106-8769-d701862471c0-kube-api-access-8z2rd\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:25 crc kubenswrapper[5127]: I0201 08:55:25.550069 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/43d1088e-bbd9-4106-8769-d701862471c0-config\") on node \"crc\" DevicePath \"\"" 
Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:25.999671 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4c47df59-sgbvx"] Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.336147 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-684dd6949-dp77r" event={"ID":"43d1088e-bbd9-4106-8769-d701862471c0","Type":"ContainerDied","Data":"b95f7838bf6022c7de5ed666f8e648d374c37e3680726efba1cba7fc20fa7068"} Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.336462 5127 scope.go:117] "RemoveContainer" containerID="7289210a878fdcf0a0d48ffc87d8cf0d2ae0998f431487dbcf98242ef30ab963" Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.336394 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-684dd6949-dp77r" Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.340804 5127 generic.go:334] "Generic (PLEG): container finished" podID="357483ec-5a75-4748-b811-c05f14bf9753" containerID="8afa87696f5ebee0e40b577267452907db0d9e436a8d1930b18e9eeb624f1225" exitCode=0 Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.340874 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86b4c6959-h82gd" Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.340874 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" event={"ID":"357483ec-5a75-4748-b811-c05f14bf9753","Type":"ContainerDied","Data":"8afa87696f5ebee0e40b577267452907db0d9e436a8d1930b18e9eeb624f1225"} Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.340948 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" event={"ID":"357483ec-5a75-4748-b811-c05f14bf9753","Type":"ContainerStarted","Data":"948d16f47d4c3fb75dcb04e1403fef84362d7b8cc3f11dc1757762c06f4c4a74"} Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.445087 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.472439 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-684dd6949-dp77r"] Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.485635 5127 scope.go:117] "RemoveContainer" containerID="4d515fb69e709ba445f842c483e5b2777d61d8d875886fe48215265000904fa1" Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.495688 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86b4c6959-h82gd"] Feb 01 08:55:26 crc kubenswrapper[5127]: I0201 08:55:26.509310 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86b4c6959-h82gd"] Feb 01 08:55:27 crc kubenswrapper[5127]: I0201 08:55:27.352388 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" event={"ID":"357483ec-5a75-4748-b811-c05f14bf9753","Type":"ContainerStarted","Data":"5f28b6c47afb63b6a8aeb67d852a9d079fb153dc5b1c58b2513933277dbaaf5b"} Feb 01 08:55:27 crc kubenswrapper[5127]: I0201 08:55:27.353036 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:27 crc kubenswrapper[5127]: I0201 08:55:27.377764 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" podStartSLOduration=2.37774239 podStartE2EDuration="2.37774239s" podCreationTimestamp="2026-02-01 08:55:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 08:55:27.373050554 +0000 UTC m=+7677.858952917" watchObservedRunningTime="2026-02-01 08:55:27.37774239 +0000 UTC m=+7677.863644753" Feb 01 08:55:28 crc kubenswrapper[5127]: I0201 08:55:28.249712 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d1088e-bbd9-4106-8769-d701862471c0" path="/var/lib/kubelet/pods/43d1088e-bbd9-4106-8769-d701862471c0/volumes" Feb 01 08:55:28 crc kubenswrapper[5127]: I0201 08:55:28.250790 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65f8db9-c312-4dac-a18c-50684c4a48d6" path="/var/lib/kubelet/pods/e65f8db9-c312-4dac-a18c-50684c4a48d6/volumes" Feb 01 08:55:29 crc kubenswrapper[5127]: I0201 08:55:29.237544 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:55:29 crc kubenswrapper[5127]: E0201 08:55:29.238257 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:55:35 crc kubenswrapper[5127]: I0201 08:55:35.065833 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9bll9"] Feb 01 08:55:35 crc kubenswrapper[5127]: I0201 08:55:35.076417 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9bll9"] Feb 01 08:55:35 crc kubenswrapper[5127]: I0201 08:55:35.472728 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f4c47df59-sgbvx" Feb 01 08:55:35 crc kubenswrapper[5127]: I0201 08:55:35.532432 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:35 crc kubenswrapper[5127]: I0201 08:55:35.532717 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="dnsmasq-dns" containerID="cri-o://d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6" gracePeriod=10 Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.084566 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.220743 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.220811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmm8h\" (UniqueName: \"kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.220880 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.220974 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.220993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.221051 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config\") pod \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\" (UID: \"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4\") " Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.227311 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h" (OuterVolumeSpecName: "kube-api-access-bmm8h") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "kube-api-access-bmm8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.252534 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="115ce664-8818-445d-8923-aa37f1ea49f6" path="/var/lib/kubelet/pods/115ce664-8818-445d-8923-aa37f1ea49f6/volumes" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.281717 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config" (OuterVolumeSpecName: "config") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.285161 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.292256 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.307331 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.314041 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" (UID: "7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323905 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmm8h\" (UniqueName: \"kubernetes.io/projected/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-kube-api-access-bmm8h\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323933 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323941 5127 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323950 5127 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323958 5127 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-config\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.323967 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.444224 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerID="d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6" exitCode=0 Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.444264 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" event={"ID":"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4","Type":"ContainerDied","Data":"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6"} Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.444295 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" event={"ID":"7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4","Type":"ContainerDied","Data":"2501b1be6056f5fa3bb31fcaed3cc1c571765b6bef7ff9745e96f04988cb72fd"} Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.444312 5127 scope.go:117] "RemoveContainer" containerID="d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.444341 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c776fcfb5-57wwt" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.481461 5127 scope.go:117] "RemoveContainer" containerID="32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.482998 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.494790 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c776fcfb5-57wwt"] Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.515807 5127 scope.go:117] "RemoveContainer" containerID="d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6" Feb 01 08:55:36 crc kubenswrapper[5127]: E0201 08:55:36.516394 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6\": container with ID starting with d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6 not found: ID does not exist" containerID="d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.516451 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6"} err="failed to get container status \"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6\": rpc error: code = NotFound desc = could not find container \"d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6\": container with ID starting with d7b6f1b501b09c8b26a4ec69cd1021958264566025f932b04d1fa93b2454c5d6 not found: ID does not exist" Feb 01 08:55:36 crc kubenswrapper[5127]: I0201 08:55:36.516495 5127 scope.go:117] "RemoveContainer" containerID="32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1" Feb 01 08:55:36 crc kubenswrapper[5127]: E0201 08:55:36.517436 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1\": container with ID starting with 32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1 not found: ID does not exist" containerID="32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1" Feb 01 08:55:36 crc 
kubenswrapper[5127]: I0201 08:55:36.517479 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1"} err="failed to get container status \"32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1\": rpc error: code = NotFound desc = could not find container \"32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1\": container with ID starting with 32bf843aff576b95c6e527ceea2e891a748e2cac9eb00657bf8453970655c8b1 not found: ID does not exist" Feb 01 08:55:37 crc kubenswrapper[5127]: I0201 08:55:37.027786 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lwm8n"] Feb 01 08:55:37 crc kubenswrapper[5127]: I0201 08:55:37.038302 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lwm8n"] Feb 01 08:55:38 crc kubenswrapper[5127]: I0201 08:55:38.255868 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" path="/var/lib/kubelet/pods/7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4/volumes" Feb 01 08:55:38 crc kubenswrapper[5127]: I0201 08:55:38.257079 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0400c1f-002e-434f-b5b6-b7fec9bca69c" path="/var/lib/kubelet/pods/a0400c1f-002e-434f-b5b6-b7fec9bca69c/volumes" Feb 01 08:55:43 crc kubenswrapper[5127]: I0201 08:55:43.236452 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:55:43 crc kubenswrapper[5127]: E0201 08:55:43.237520 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.904678 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd"] Feb 01 08:55:45 crc kubenswrapper[5127]: E0201 08:55:45.905835 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="init" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.905850 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="init" Feb 01 08:55:45 crc kubenswrapper[5127]: E0201 08:55:45.905867 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.905873 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: E0201 08:55:45.905890 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="init" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.905896 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="init" Feb 01 08:55:45 crc kubenswrapper[5127]: E0201 08:55:45.905928 5127 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.905933 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.911418 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa17fc9-c47f-4e20-b9e6-0d0782bbffa4" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.911519 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d1088e-bbd9-4106-8769-d701862471c0" containerName="dnsmasq-dns" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.944506 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.949222 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd"] Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.951926 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.952810 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.953205 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.953893 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.971628 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv"] Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.974455 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.978035 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.980011 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 08:55:45 crc kubenswrapper[5127]: I0201 08:55:45.993759 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv"] Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.058616 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.058695 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.058856 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.058946 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.059019 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.059075 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvw2s\" (UniqueName: \"kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.059318 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmk6l\" (UniqueName: \"kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.059439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.059839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162059 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162155 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162192 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162239 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162283 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162341 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvw2s\" (UniqueName: \"kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162421 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmk6l\" (UniqueName: \"kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162482 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.162667 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.171609 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.171761 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.172031 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.175502 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.179120 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.181192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.183023 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.184248 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmk6l\" (UniqueName: \"kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.186105 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvw2s\" (UniqueName: \"kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.285969 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.297953 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.872283 5127 scope.go:117] "RemoveContainer" containerID="05c491ed359c2ffac9d5cc48b0d73a4a5647f2064cdacfebdaa40d3af42122aa" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.904939 5127 scope.go:117] "RemoveContainer" containerID="92e7ceda27d9704f7f6946ff56e623299cf9c96278bb0dc5e98bd560a309c0b8" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.926003 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv"] Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.963449 5127 scope.go:117] "RemoveContainer" containerID="05c145723f144367b651cce19e3638fabdd06be8bd2ac3ec690c96748f0ac662" Feb 01 08:55:46 crc kubenswrapper[5127]: I0201 08:55:46.992669 5127 scope.go:117] "RemoveContainer" containerID="4626eaaf3f5bed3739a044ecda59c564da01048ffc6d767addd4cf9b14c7e640" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.022378 5127 scope.go:117] "RemoveContainer" containerID="62b6a8c25fc0540e522e04e13dfd46a0557f953c842b917df6abbcc13d3e0c6d" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.027081 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd"] Feb 01 08:55:47 crc kubenswrapper[5127]: W0201 08:55:47.030440 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0e75902_bec9_4e02_851f_b2a306355649.slice/crio-316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1 WatchSource:0}: Error finding container 316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1: Status 404 returned error can't find the container with id 316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1 Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.073549 5127 scope.go:117] "RemoveContainer" containerID="5ce192ab93b075c2f785499a1ea0f266ecaced66bd27756e215e70189c6caa43" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.129254 5127 scope.go:117] "RemoveContainer" containerID="ffafe99877109ce63df8f9a8a6d8496ebe38981275b62aebd8e56368fb78b7ea" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.150214 5127 scope.go:117] "RemoveContainer" containerID="e7b75988749b05f8c5324faefccbe6122960b33db866cdca50805b8ca0257587" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.169462 5127 scope.go:117] "RemoveContainer" containerID="7aa19b3a25c83dda21ca109c6fb0130944d644f9913ed7a748ed4326fb3270b3" Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.585148 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" event={"ID":"d0e75902-bec9-4e02-851f-b2a306355649","Type":"ContainerStarted","Data":"316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1"} Feb 01 08:55:47 crc kubenswrapper[5127]: I0201 08:55:47.588146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" event={"ID":"4c4da666-5d8e-4fee-b952-2b45b93011ad","Type":"ContainerStarted","Data":"d7bd6ee9c1ec63569a03def30cacd9d56fa1e4a912905c6187ee20cc42428dc6"} Feb 01 08:55:55 crc kubenswrapper[5127]: I0201 08:55:55.044451 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7zwl"] Feb 01 08:55:55 crc 
kubenswrapper[5127]: I0201 08:55:55.049612 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-r7zwl"] Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.237286 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:55:56 crc kubenswrapper[5127]: E0201 08:55:56.238019 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.250997 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74da345-e546-4397-a7de-e285df23f3fd" path="/var/lib/kubelet/pods/d74da345-e546-4397-a7de-e285df23f3fd/volumes" Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.726815 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" event={"ID":"4c4da666-5d8e-4fee-b952-2b45b93011ad","Type":"ContainerStarted","Data":"88e1e4821dbf461934cc01d41d7b6b6a3af7243d19a0f81aa840daf03e67ad83"} Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.729762 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" event={"ID":"d0e75902-bec9-4e02-851f-b2a306355649","Type":"ContainerStarted","Data":"2d7a34497bfcf62778ac1a58a33b3b930dcd56e6bf3982a0bbd5b2646c8e253d"} Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.764782 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" podStartSLOduration=2.711266884 podStartE2EDuration="11.764758847s" podCreationTimestamp="2026-02-01 08:55:45 +0000 UTC" firstStartedPulling="2026-02-01 08:55:46.963486364 +0000 UTC m=+7697.449388747" lastFinishedPulling="2026-02-01 08:55:56.016978357 +0000 UTC m=+7706.502880710" observedRunningTime="2026-02-01 08:55:56.75257625 +0000 UTC m=+7707.238478633" watchObservedRunningTime="2026-02-01 08:55:56.764758847 +0000 UTC m=+7707.250661230" Feb 01 08:55:56 crc kubenswrapper[5127]: I0201 08:55:56.785831 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" podStartSLOduration=2.83617762 podStartE2EDuration="11.785804452s" podCreationTimestamp="2026-02-01 08:55:45 +0000 UTC" firstStartedPulling="2026-02-01 08:55:47.032459018 +0000 UTC m=+7697.518361381" lastFinishedPulling="2026-02-01 08:55:55.98208585 +0000 UTC m=+7706.467988213" observedRunningTime="2026-02-01 08:55:56.777007566 +0000 UTC m=+7707.262909929" watchObservedRunningTime="2026-02-01 08:55:56.785804452 +0000 UTC m=+7707.271706835" Feb 01 08:56:06 crc kubenswrapper[5127]: I0201 08:56:06.865445 5127 generic.go:334] "Generic (PLEG): container finished" podID="4c4da666-5d8e-4fee-b952-2b45b93011ad" containerID="88e1e4821dbf461934cc01d41d7b6b6a3af7243d19a0f81aa840daf03e67ad83" exitCode=0 Feb 01 08:56:06 crc kubenswrapper[5127]: I0201 08:56:06.865516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" 
event={"ID":"4c4da666-5d8e-4fee-b952-2b45b93011ad","Type":"ContainerDied","Data":"88e1e4821dbf461934cc01d41d7b6b6a3af7243d19a0f81aa840daf03e67ad83"} Feb 01 08:56:07 crc kubenswrapper[5127]: I0201 08:56:07.884128 5127 generic.go:334] "Generic (PLEG): container finished" podID="d0e75902-bec9-4e02-851f-b2a306355649" containerID="2d7a34497bfcf62778ac1a58a33b3b930dcd56e6bf3982a0bbd5b2646c8e253d" exitCode=0 Feb 01 08:56:07 crc kubenswrapper[5127]: I0201 08:56:07.884234 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" event={"ID":"d0e75902-bec9-4e02-851f-b2a306355649","Type":"ContainerDied","Data":"2d7a34497bfcf62778ac1a58a33b3b930dcd56e6bf3982a0bbd5b2646c8e253d"} Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.413333 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.500828 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle\") pod \"4c4da666-5d8e-4fee-b952-2b45b93011ad\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.500923 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmk6l\" (UniqueName: \"kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l\") pod \"4c4da666-5d8e-4fee-b952-2b45b93011ad\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.500980 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory\") pod \"4c4da666-5d8e-4fee-b952-2b45b93011ad\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.501081 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker\") pod \"4c4da666-5d8e-4fee-b952-2b45b93011ad\" (UID: \"4c4da666-5d8e-4fee-b952-2b45b93011ad\") " Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.508919 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l" (OuterVolumeSpecName: "kube-api-access-cmk6l") pod "4c4da666-5d8e-4fee-b952-2b45b93011ad" (UID: "4c4da666-5d8e-4fee-b952-2b45b93011ad"). InnerVolumeSpecName "kube-api-access-cmk6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.509983 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "4c4da666-5d8e-4fee-b952-2b45b93011ad" (UID: "4c4da666-5d8e-4fee-b952-2b45b93011ad"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.534093 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory" (OuterVolumeSpecName: "inventory") pod "4c4da666-5d8e-4fee-b952-2b45b93011ad" (UID: "4c4da666-5d8e-4fee-b952-2b45b93011ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.540026 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "4c4da666-5d8e-4fee-b952-2b45b93011ad" (UID: "4c4da666-5d8e-4fee-b952-2b45b93011ad"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.604078 5127 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.604114 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmk6l\" (UniqueName: \"kubernetes.io/projected/4c4da666-5d8e-4fee-b952-2b45b93011ad-kube-api-access-cmk6l\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.604125 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.604133 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/4c4da666-5d8e-4fee-b952-2b45b93011ad-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.899163 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.903249 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv" event={"ID":"4c4da666-5d8e-4fee-b952-2b45b93011ad","Type":"ContainerDied","Data":"d7bd6ee9c1ec63569a03def30cacd9d56fa1e4a912905c6187ee20cc42428dc6"} Feb 01 08:56:08 crc kubenswrapper[5127]: I0201 08:56:08.903869 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7bd6ee9c1ec63569a03def30cacd9d56fa1e4a912905c6187ee20cc42428dc6" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.459307 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.524020 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1\") pod \"d0e75902-bec9-4e02-851f-b2a306355649\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.524075 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle\") pod \"d0e75902-bec9-4e02-851f-b2a306355649\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.524264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph\") pod \"d0e75902-bec9-4e02-851f-b2a306355649\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.524311 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvw2s\" (UniqueName: \"kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s\") pod \"d0e75902-bec9-4e02-851f-b2a306355649\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.524337 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory\") pod \"d0e75902-bec9-4e02-851f-b2a306355649\" (UID: \"d0e75902-bec9-4e02-851f-b2a306355649\") " Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.529467 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "d0e75902-bec9-4e02-851f-b2a306355649" (UID: "d0e75902-bec9-4e02-851f-b2a306355649"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.531281 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph" (OuterVolumeSpecName: "ceph") pod "d0e75902-bec9-4e02-851f-b2a306355649" (UID: "d0e75902-bec9-4e02-851f-b2a306355649"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.532248 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s" (OuterVolumeSpecName: "kube-api-access-lvw2s") pod "d0e75902-bec9-4e02-851f-b2a306355649" (UID: "d0e75902-bec9-4e02-851f-b2a306355649"). InnerVolumeSpecName "kube-api-access-lvw2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.552777 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d0e75902-bec9-4e02-851f-b2a306355649" (UID: "d0e75902-bec9-4e02-851f-b2a306355649"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.574227 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory" (OuterVolumeSpecName: "inventory") pod "d0e75902-bec9-4e02-851f-b2a306355649" (UID: "d0e75902-bec9-4e02-851f-b2a306355649"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.626867 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.626904 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvw2s\" (UniqueName: \"kubernetes.io/projected/d0e75902-bec9-4e02-851f-b2a306355649-kube-api-access-lvw2s\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.626917 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.626927 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.626935 5127 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0e75902-bec9-4e02-851f-b2a306355649-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.913446 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" event={"ID":"d0e75902-bec9-4e02-851f-b2a306355649","Type":"ContainerDied","Data":"316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1"} Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.913489 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="316cab6e63c4a390d9eac47db1a20e376074f4ede0e0f36d4c2599b2423316e1" Feb 01 08:56:09 crc kubenswrapper[5127]: I0201 08:56:09.913541 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd" Feb 01 08:56:11 crc kubenswrapper[5127]: I0201 08:56:11.237777 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:56:11 crc kubenswrapper[5127]: E0201 08:56:11.238834 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.748295 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc"] Feb 01 08:56:18 crc kubenswrapper[5127]: E0201 08:56:18.749704 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4da666-5d8e-4fee-b952-2b45b93011ad" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.749732 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4da666-5d8e-4fee-b952-2b45b93011ad" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 01 08:56:18 crc kubenswrapper[5127]: E0201 08:56:18.749783 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e75902-bec9-4e02-851f-b2a306355649" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.749799 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e75902-bec9-4e02-851f-b2a306355649" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.750216 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4da666-5d8e-4fee-b952-2b45b93011ad" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.750241 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e75902-bec9-4e02-851f-b2a306355649" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.751546 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.758575 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.759234 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.759572 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.760809 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg"] Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.762465 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.764686 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.765373 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.767132 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.794468 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc"] Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.814521 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg"] Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.870639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.870936 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876vk\" (UniqueName: \"kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.870963 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871146 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871209 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871336 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871380 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871623 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfpp\" (UniqueName: \"kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.871916 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.973824 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.973906 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.973939 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-876vk\" (UniqueName: \"kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.973960 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 
08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.974001 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.974027 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.974055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.974076 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.974126 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfpp\" (UniqueName: \"kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.980848 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.980984 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.991451 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.991811 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.992434 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.993298 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.996015 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfpp\" (UniqueName: \"kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.996530 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-876vk\" (UniqueName: \"kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:18 crc kubenswrapper[5127]: I0201 08:56:18.997668 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:19 crc kubenswrapper[5127]: I0201 08:56:19.097622 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 08:56:19 crc kubenswrapper[5127]: I0201 08:56:19.114947 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 08:56:19 crc kubenswrapper[5127]: I0201 08:56:19.729793 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg"] Feb 01 08:56:20 crc kubenswrapper[5127]: I0201 08:56:20.038472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" event={"ID":"2017b1bc-65d4-4031-a56c-8aa5399a6446","Type":"ContainerStarted","Data":"4bc0fb98bf3694666c63d3913763afd20c881d5ece12374b21986b487a38940f"} Feb 01 08:56:20 crc kubenswrapper[5127]: W0201 08:56:20.619732 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833a198f_222c_4ce9_a629_f1138fbd1fce.slice/crio-be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c WatchSource:0}: Error finding container be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c: Status 404 returned error can't find the container with id be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c Feb 01 08:56:20 crc kubenswrapper[5127]: I0201 08:56:20.627342 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc"] Feb 01 08:56:21 crc kubenswrapper[5127]: I0201 08:56:21.058335 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" event={"ID":"833a198f-222c-4ce9-a629-f1138fbd1fce","Type":"ContainerStarted","Data":"be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c"} Feb 01 08:56:21 crc kubenswrapper[5127]: I0201 08:56:21.061250 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" event={"ID":"2017b1bc-65d4-4031-a56c-8aa5399a6446","Type":"ContainerStarted","Data":"47917f781cdff0ae02f94b0a564840f5a8596e553b2a70630c88e296763a3a9a"} Feb 01 08:56:22 crc kubenswrapper[5127]: I0201 08:56:22.078687 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" event={"ID":"833a198f-222c-4ce9-a629-f1138fbd1fce","Type":"ContainerStarted","Data":"978804468bbc54fe25d88054c589ac8516b7af10d4852c7eea107c1fcfb042a7"} Feb 01 08:56:22 crc kubenswrapper[5127]: I0201 08:56:22.115692 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" podStartSLOduration=3.713820502 podStartE2EDuration="4.115668029s" podCreationTimestamp="2026-02-01 08:56:18 +0000 UTC" firstStartedPulling="2026-02-01 08:56:20.622606806 +0000 UTC m=+7731.108509159" lastFinishedPulling="2026-02-01 08:56:21.024454313 +0000 UTC m=+7731.510356686" observedRunningTime="2026-02-01 08:56:22.103186354 +0000 UTC m=+7732.589088767" watchObservedRunningTime="2026-02-01 08:56:22.115668029 +0000 UTC m=+7732.601570402" Feb 01 08:56:22 crc kubenswrapper[5127]: I0201 08:56:22.119003 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" podStartSLOduration=3.703060885 podStartE2EDuration="4.118992249s" podCreationTimestamp="2026-02-01 08:56:18 +0000 UTC" firstStartedPulling="2026-02-01 08:56:19.735860404 +0000 UTC m=+7730.221762787" lastFinishedPulling="2026-02-01 08:56:20.151791778 +0000 UTC m=+7730.637694151" 
observedRunningTime="2026-02-01 08:56:21.084008993 +0000 UTC m=+7731.569911366" watchObservedRunningTime="2026-02-01 08:56:22.118992249 +0000 UTC m=+7732.604894622" Feb 01 08:56:22 crc kubenswrapper[5127]: I0201 08:56:22.239351 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:56:22 crc kubenswrapper[5127]: E0201 08:56:22.239661 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:56:36 crc kubenswrapper[5127]: I0201 08:56:36.236346 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:56:36 crc kubenswrapper[5127]: E0201 08:56:36.237216 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 08:56:38 crc kubenswrapper[5127]: I0201 08:56:38.074922 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-p2tgk"] Feb 01 08:56:38 crc kubenswrapper[5127]: I0201 08:56:38.093483 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-p2tgk"] Feb 01 08:56:38 crc kubenswrapper[5127]: I0201 08:56:38.265454 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2af6f30-000e-4d33-ae4c-e26cdd6ee07a" path="/var/lib/kubelet/pods/a2af6f30-000e-4d33-ae4c-e26cdd6ee07a/volumes" Feb 01 08:56:39 crc kubenswrapper[5127]: I0201 08:56:39.030236 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-1d5c-account-create-update-m2g4t"] Feb 01 08:56:39 crc kubenswrapper[5127]: I0201 08:56:39.040769 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-1d5c-account-create-update-m2g4t"] Feb 01 08:56:40 crc kubenswrapper[5127]: I0201 08:56:40.249373 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337262d4-08b6-4c72-82eb-7b5c230e384b" path="/var/lib/kubelet/pods/337262d4-08b6-4c72-82eb-7b5c230e384b/volumes" Feb 01 08:56:47 crc kubenswrapper[5127]: I0201 08:56:47.451436 5127 scope.go:117] "RemoveContainer" containerID="2ae06785aebd6b6f7c3ede94dbdcf521f3d0ae20c6bf76e960c6c1bb0e5b897c" Feb 01 08:56:47 crc kubenswrapper[5127]: I0201 08:56:47.525673 5127 scope.go:117] "RemoveContainer" containerID="0b1cfd03845ed98280b6782c940383e5524925d7e38d307d05c32d3e7aa8ef3f" Feb 01 08:56:47 crc kubenswrapper[5127]: I0201 08:56:47.559360 5127 scope.go:117] "RemoveContainer" containerID="c1504ede8b2f6da26e2c57585acad76bdd4f079215e12ab362e163b6ac73e19a" Feb 01 08:56:49 crc kubenswrapper[5127]: I0201 08:56:49.236323 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 08:56:50 crc kubenswrapper[5127]: I0201 08:56:50.422735 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153"} Feb 01 08:57:23 crc kubenswrapper[5127]: I0201 08:57:23.072458 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b8lmt"] Feb 01 08:57:23 crc kubenswrapper[5127]: I0201 08:57:23.091972 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b8lmt"] Feb 01 08:57:24 crc kubenswrapper[5127]: I0201 08:57:24.250658 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb1b5ac-6358-4824-8649-5c48340d4349" path="/var/lib/kubelet/pods/ccb1b5ac-6358-4824-8649-5c48340d4349/volumes" Feb 01 08:57:47 crc kubenswrapper[5127]: I0201 08:57:47.715902 5127 scope.go:117] "RemoveContainer" containerID="1fcbba84458820a6cd56e7847b84698f6911603ed0c37b59c69400f84044c11e" Feb 01 08:59:06 crc kubenswrapper[5127]: I0201 08:59:06.740663 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:59:06 crc kubenswrapper[5127]: I0201 08:59:06.741210 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:59:36 crc kubenswrapper[5127]: I0201 08:59:36.741123 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 08:59:36 crc kubenswrapper[5127]: I0201 08:59:36.741880 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 08:59:47 crc kubenswrapper[5127]: I0201 08:59:47.837803 5127 scope.go:117] "RemoveContainer" containerID="cab38501d75c76307aa9e3889abf1afc81c1433236c8c7e9b4f9ce70606a253f" Feb 01 08:59:47 crc kubenswrapper[5127]: I0201 08:59:47.883921 5127 scope.go:117] "RemoveContainer" containerID="0a7badc1a06a46ac9174424344c02d89f78cbdd3ce09bc80919713ee4ac9e823" Feb 01 08:59:47 crc kubenswrapper[5127]: I0201 08:59:47.918977 5127 scope.go:117] "RemoveContainer" containerID="bee0c921be2ee8396b0f6b5247af52d5007c1b2cacb9f22613676c520d165b0b" Feb 01 08:59:47 crc kubenswrapper[5127]: I0201 08:59:47.943438 5127 scope.go:117] "RemoveContainer" containerID="68e0e6dff98419c8db624beacab38238758bb0acfb839cbd1d2623f45684b74a" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.181927 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl"] Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.184428 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.198837 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl"] Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.211779 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.213866 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.330791 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9dz\" (UniqueName: \"kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.330971 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.331005 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.433273 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.433634 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.433832 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9dz\" (UniqueName: \"kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.434363 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume\") pod 
\"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.441082 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.453436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9dz\" (UniqueName: \"kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz\") pod \"collect-profiles-29498940-hlxfl\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.527950 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:00 crc kubenswrapper[5127]: I0201 09:00:00.995282 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl"] Feb 01 09:00:01 crc kubenswrapper[5127]: W0201 09:00:01.022219 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod445bd8b4_0a63_4b4d_a81d_d7c9cb4ba049.slice/crio-e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f WatchSource:0}: Error finding container e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f: Status 404 returned error can't find the container with id e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f Feb 01 09:00:01 crc kubenswrapper[5127]: I0201 09:00:01.623214 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" event={"ID":"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049","Type":"ContainerStarted","Data":"dbf8b0f82807a556d598bf436c2ef0c0a14d78cc278a0509b4c0b8e1ed3f9579"} Feb 01 09:00:01 crc kubenswrapper[5127]: I0201 09:00:01.623705 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" event={"ID":"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049","Type":"ContainerStarted","Data":"e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f"} Feb 01 09:00:01 crc kubenswrapper[5127]: I0201 09:00:01.650713 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" podStartSLOduration=1.6506913060000001 podStartE2EDuration="1.650691306s" podCreationTimestamp="2026-02-01 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:00:01.641957021 +0000 UTC m=+7952.127859384" watchObservedRunningTime="2026-02-01 09:00:01.650691306 +0000 UTC m=+7952.136593669" Feb 01 09:00:02 crc kubenswrapper[5127]: I0201 09:00:02.637796 5127 generic.go:334] "Generic (PLEG): container finished" podID="445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" containerID="dbf8b0f82807a556d598bf436c2ef0c0a14d78cc278a0509b4c0b8e1ed3f9579" exitCode=0 Feb 01 09:00:02 crc kubenswrapper[5127]: I0201 
09:00:02.637878 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" event={"ID":"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049","Type":"ContainerDied","Data":"dbf8b0f82807a556d598bf436c2ef0c0a14d78cc278a0509b4c0b8e1ed3f9579"} Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.047879 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.121277 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume\") pod \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.121399 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume\") pod \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.121553 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq9dz\" (UniqueName: \"kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz\") pod \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\" (UID: \"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049\") " Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.123447 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume" (OuterVolumeSpecName: "config-volume") pod "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" (UID: "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.130935 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz" (OuterVolumeSpecName: "kube-api-access-qq9dz") pod "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" (UID: "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049"). InnerVolumeSpecName "kube-api-access-qq9dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.132056 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" (UID: "445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.224273 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.224312 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.224326 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq9dz\" (UniqueName: \"kubernetes.io/projected/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049-kube-api-access-qq9dz\") on node \"crc\" DevicePath \"\"" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.658542 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" event={"ID":"445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049","Type":"ContainerDied","Data":"e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f"} Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.658661 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e21d299fdd602be760e6395aa81c11861fb996e55a858d0d93099cd87f6e859f" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.658676 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl" Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.741796 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4"] Feb 01 09:00:04 crc kubenswrapper[5127]: I0201 09:00:04.750760 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498895-xrkj4"] Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.250539 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73f741ee-d517-4882-9f2d-c940b1dfa333" path="/var/lib/kubelet/pods/73f741ee-d517-4882-9f2d-c940b1dfa333/volumes" Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.741151 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.741248 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.741325 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.742564 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153"} 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:00:06 crc kubenswrapper[5127]: I0201 09:00:06.742748 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153" gracePeriod=600 Feb 01 09:00:07 crc kubenswrapper[5127]: I0201 09:00:07.696990 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153" exitCode=0 Feb 01 09:00:07 crc kubenswrapper[5127]: I0201 09:00:07.697071 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153"} Feb 01 09:00:07 crc kubenswrapper[5127]: I0201 09:00:07.698296 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929"} Feb 01 09:00:07 crc kubenswrapper[5127]: I0201 09:00:07.698353 5127 scope.go:117] "RemoveContainer" containerID="8c595937cd8bc442c5d860491db2d15900af9b3e555185c05cbc8d3fd58d20bf" Feb 01 09:00:48 crc kubenswrapper[5127]: I0201 09:00:48.033221 5127 scope.go:117] "RemoveContainer" containerID="983bbad0d04e7290731fcff2ddd90c166ca9e264f3ee9e8403d19c886a59675f" Feb 01 09:00:53 crc kubenswrapper[5127]: I0201 09:00:53.059197 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-2dc4-account-create-update-jgplp"] Feb 01 09:00:53 crc kubenswrapper[5127]: I0201 09:00:53.077959 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-wmmqn"] Feb 01 09:00:53 crc kubenswrapper[5127]: I0201 09:00:53.093059 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-wmmqn"] Feb 01 09:00:53 crc kubenswrapper[5127]: I0201 09:00:53.103113 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-2dc4-account-create-update-jgplp"] Feb 01 09:00:54 crc kubenswrapper[5127]: I0201 09:00:54.247330 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1cb163-c7c2-48e7-8b6f-645a3aac9f08" path="/var/lib/kubelet/pods/4a1cb163-c7c2-48e7-8b6f-645a3aac9f08/volumes" Feb 01 09:00:54 crc kubenswrapper[5127]: I0201 09:00:54.248770 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaacf6b8-b97a-4b64-af63-8488a4b422e2" path="/var/lib/kubelet/pods/eaacf6b8-b97a-4b64-af63-8488a4b422e2/volumes" Feb 01 09:00:58 crc kubenswrapper[5127]: I0201 09:00:58.322717 5127 generic.go:334] "Generic (PLEG): container finished" podID="2017b1bc-65d4-4031-a56c-8aa5399a6446" containerID="47917f781cdff0ae02f94b0a564840f5a8596e553b2a70630c88e296763a3a9a" exitCode=0 Feb 01 09:00:58 crc kubenswrapper[5127]: I0201 09:00:58.322956 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" 
event={"ID":"2017b1bc-65d4-4031-a56c-8aa5399a6446","Type":"ContainerDied","Data":"47917f781cdff0ae02f94b0a564840f5a8596e553b2a70630c88e296763a3a9a"} Feb 01 09:00:59 crc kubenswrapper[5127]: I0201 09:00:59.903845 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.030809 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker\") pod \"2017b1bc-65d4-4031-a56c-8aa5399a6446\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.031235 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory\") pod \"2017b1bc-65d4-4031-a56c-8aa5399a6446\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.031448 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle\") pod \"2017b1bc-65d4-4031-a56c-8aa5399a6446\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.031593 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-876vk\" (UniqueName: \"kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk\") pod \"2017b1bc-65d4-4031-a56c-8aa5399a6446\" (UID: \"2017b1bc-65d4-4031-a56c-8aa5399a6446\") " Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.037310 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk" (OuterVolumeSpecName: "kube-api-access-876vk") pod "2017b1bc-65d4-4031-a56c-8aa5399a6446" (UID: "2017b1bc-65d4-4031-a56c-8aa5399a6446"). InnerVolumeSpecName "kube-api-access-876vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.048885 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "2017b1bc-65d4-4031-a56c-8aa5399a6446" (UID: "2017b1bc-65d4-4031-a56c-8aa5399a6446"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.059869 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory" (OuterVolumeSpecName: "inventory") pod "2017b1bc-65d4-4031-a56c-8aa5399a6446" (UID: "2017b1bc-65d4-4031-a56c-8aa5399a6446"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.066334 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "2017b1bc-65d4-4031-a56c-8aa5399a6446" (UID: "2017b1bc-65d4-4031-a56c-8aa5399a6446"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.134361 5127 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.134397 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-876vk\" (UniqueName: \"kubernetes.io/projected/2017b1bc-65d4-4031-a56c-8aa5399a6446-kube-api-access-876vk\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.134409 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.134421 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2017b1bc-65d4-4031-a56c-8aa5399a6446-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.146475 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29498941-gpxpc"] Feb 01 09:01:00 crc kubenswrapper[5127]: E0201 09:01:00.147720 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.147745 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[5127]: E0201 09:01:00.147764 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2017b1bc-65d4-4031-a56c-8aa5399a6446" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.147771 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2017b1bc-65d4-4031-a56c-8aa5399a6446" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.147982 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" containerName="collect-profiles" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.148003 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2017b1bc-65d4-4031-a56c-8aa5399a6446" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.148861 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.161668 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498941-gpxpc"] Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.238527 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.238653 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt67h\" (UniqueName: \"kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.238730 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.238826 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.341131 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.341222 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt67h\" (UniqueName: \"kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.341324 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.341369 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.348007 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.348064 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.348009 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg" event={"ID":"2017b1bc-65d4-4031-a56c-8aa5399a6446","Type":"ContainerDied","Data":"4bc0fb98bf3694666c63d3913763afd20c881d5ece12374b21986b487a38940f"} Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.348103 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bc0fb98bf3694666c63d3913763afd20c881d5ece12374b21986b487a38940f" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.349565 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.351814 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.377362 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt67h\" (UniqueName: \"kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h\") pod \"keystone-cron-29498941-gpxpc\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.486497 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:00 crc kubenswrapper[5127]: I0201 09:01:00.988454 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29498941-gpxpc"] Feb 01 09:01:01 crc kubenswrapper[5127]: I0201 09:01:01.359340 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-gpxpc" event={"ID":"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0","Type":"ContainerStarted","Data":"a590b55f2fe6552e456a724b588ed309b9f2055599a1c98fcedf3c8c98162aac"} Feb 01 09:01:01 crc kubenswrapper[5127]: I0201 09:01:01.359711 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-gpxpc" event={"ID":"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0","Type":"ContainerStarted","Data":"052077a7f2d26614c72f04e972082054f89105b9c579570faa1a49713e1be7c2"} Feb 01 09:01:01 crc kubenswrapper[5127]: I0201 09:01:01.389368 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29498941-gpxpc" podStartSLOduration=1.389344163 podStartE2EDuration="1.389344163s" podCreationTimestamp="2026-02-01 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:01:01.380132215 +0000 UTC m=+8011.866034578" watchObservedRunningTime="2026-02-01 09:01:01.389344163 +0000 UTC m=+8011.875246536" Feb 01 09:01:04 crc kubenswrapper[5127]: I0201 09:01:04.392058 5127 generic.go:334] "Generic (PLEG): container finished" podID="d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" containerID="a590b55f2fe6552e456a724b588ed309b9f2055599a1c98fcedf3c8c98162aac" exitCode=0 Feb 01 09:01:04 crc kubenswrapper[5127]: I0201 09:01:04.392173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-gpxpc" event={"ID":"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0","Type":"ContainerDied","Data":"a590b55f2fe6552e456a724b588ed309b9f2055599a1c98fcedf3c8c98162aac"} Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.783800 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.873396 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle\") pod \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.873458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys\") pod \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.873527 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt67h\" (UniqueName: \"kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h\") pod \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.873807 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data\") pod \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\" (UID: \"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0\") " Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.882955 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" (UID: "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.883185 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h" (OuterVolumeSpecName: "kube-api-access-kt67h") pod "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" (UID: "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0"). InnerVolumeSpecName "kube-api-access-kt67h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.907289 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" (UID: "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.952101 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data" (OuterVolumeSpecName: "config-data") pod "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" (UID: "d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.976619 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.976957 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.976976 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:05 crc kubenswrapper[5127]: I0201 09:01:05.976991 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt67h\" (UniqueName: \"kubernetes.io/projected/d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0-kube-api-access-kt67h\") on node \"crc\" DevicePath \"\"" Feb 01 09:01:06 crc kubenswrapper[5127]: I0201 09:01:06.414444 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29498941-gpxpc" event={"ID":"d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0","Type":"ContainerDied","Data":"052077a7f2d26614c72f04e972082054f89105b9c579570faa1a49713e1be7c2"} Feb 01 09:01:06 crc kubenswrapper[5127]: I0201 09:01:06.414480 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="052077a7f2d26614c72f04e972082054f89105b9c579570faa1a49713e1be7c2" Feb 01 09:01:06 crc kubenswrapper[5127]: I0201 09:01:06.414569 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29498941-gpxpc" Feb 01 09:01:08 crc kubenswrapper[5127]: I0201 09:01:08.057639 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-649mx"] Feb 01 09:01:08 crc kubenswrapper[5127]: I0201 09:01:08.070574 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-649mx"] Feb 01 09:01:08 crc kubenswrapper[5127]: I0201 09:01:08.247720 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d893f5d8-79e7-4f3f-b16f-779eec683eda" path="/var/lib/kubelet/pods/d893f5d8-79e7-4f3f-b16f-779eec683eda/volumes" Feb 01 09:01:48 crc kubenswrapper[5127]: I0201 09:01:48.132936 5127 scope.go:117] "RemoveContainer" containerID="03ef4e34b84e5f0ba021148c2642d16c20dd3c5825c1a1bcb895e91ea0d3671f" Feb 01 09:01:48 crc kubenswrapper[5127]: I0201 09:01:48.179681 5127 scope.go:117] "RemoveContainer" containerID="dc7726286e755c8d59d912b097f90e645f5d736ad054468d1a9a4c91a4792592" Feb 01 09:01:48 crc kubenswrapper[5127]: I0201 09:01:48.242004 5127 scope.go:117] "RemoveContainer" containerID="9a04efd0776c9fd65d0393c1fc10ccc2009d239be2b2bcf29624744258bd48a5" Feb 01 09:02:09 crc kubenswrapper[5127]: I0201 09:02:09.938282 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:09 crc kubenswrapper[5127]: E0201 09:02:09.939370 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" containerName="keystone-cron" Feb 01 09:02:09 crc kubenswrapper[5127]: I0201 09:02:09.939385 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" containerName="keystone-cron" Feb 01 09:02:09 crc kubenswrapper[5127]: I0201 
09:02:09.939625 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0" containerName="keystone-cron" Feb 01 09:02:09 crc kubenswrapper[5127]: I0201 09:02:09.941229 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:09 crc kubenswrapper[5127]: I0201 09:02:09.959734 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.117094 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.117155 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.117765 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrq5m\" (UniqueName: \"kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.220205 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrq5m\" (UniqueName: \"kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.220281 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.220304 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.221080 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.221113 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities\") pod 
\"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.242347 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrq5m\" (UniqueName: \"kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m\") pod \"redhat-operators-4s8zg\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.276187 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:10 crc kubenswrapper[5127]: I0201 09:02:10.781056 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:11 crc kubenswrapper[5127]: I0201 09:02:11.157754 5127 generic.go:334] "Generic (PLEG): container finished" podID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerID="61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920" exitCode=0 Feb 01 09:02:11 crc kubenswrapper[5127]: I0201 09:02:11.157824 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerDied","Data":"61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920"} Feb 01 09:02:11 crc kubenswrapper[5127]: I0201 09:02:11.158140 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerStarted","Data":"e5bd249a1a588630cd1890b9d550294efad57f78376e96023f7c6898b3820cb6"} Feb 01 09:02:11 crc kubenswrapper[5127]: I0201 09:02:11.160068 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:02:12 crc kubenswrapper[5127]: I0201 09:02:12.190970 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerStarted","Data":"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e"} Feb 01 09:02:17 crc kubenswrapper[5127]: I0201 09:02:17.248937 5127 generic.go:334] "Generic (PLEG): container finished" podID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerID="4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e" exitCode=0 Feb 01 09:02:17 crc kubenswrapper[5127]: I0201 09:02:17.249125 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerDied","Data":"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e"} Feb 01 09:02:18 crc kubenswrapper[5127]: I0201 09:02:18.262556 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerStarted","Data":"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba"} Feb 01 09:02:18 crc kubenswrapper[5127]: I0201 09:02:18.305517 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4s8zg" podStartSLOduration=2.753822583 podStartE2EDuration="9.30549168s" podCreationTimestamp="2026-02-01 09:02:09 +0000 UTC" firstStartedPulling="2026-02-01 09:02:11.159766764 +0000 UTC m=+8081.645669147" 
lastFinishedPulling="2026-02-01 09:02:17.711435881 +0000 UTC m=+8088.197338244" observedRunningTime="2026-02-01 09:02:18.291570047 +0000 UTC m=+8088.777472450" watchObservedRunningTime="2026-02-01 09:02:18.30549168 +0000 UTC m=+8088.791394053" Feb 01 09:02:20 crc kubenswrapper[5127]: I0201 09:02:20.276410 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:20 crc kubenswrapper[5127]: I0201 09:02:20.276839 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:21 crc kubenswrapper[5127]: I0201 09:02:21.360390 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4s8zg" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="registry-server" probeResult="failure" output=< Feb 01 09:02:21 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:02:21 crc kubenswrapper[5127]: > Feb 01 09:02:30 crc kubenswrapper[5127]: I0201 09:02:30.508523 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:30 crc kubenswrapper[5127]: I0201 09:02:30.579222 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:30 crc kubenswrapper[5127]: I0201 09:02:30.770906 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:32 crc kubenswrapper[5127]: I0201 09:02:32.430558 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4s8zg" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="registry-server" containerID="cri-o://d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba" gracePeriod=2 Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.025438 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.096766 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities\") pod \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.096924 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrq5m\" (UniqueName: \"kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m\") pod \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.096977 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content\") pod \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\" (UID: \"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65\") " Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.098057 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities" (OuterVolumeSpecName: "utilities") pod "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" (UID: "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.103801 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m" (OuterVolumeSpecName: "kube-api-access-xrq5m") pod "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" (UID: "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65"). InnerVolumeSpecName "kube-api-access-xrq5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.200017 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.200286 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrq5m\" (UniqueName: \"kubernetes.io/projected/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-kube-api-access-xrq5m\") on node \"crc\" DevicePath \"\"" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.234891 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" (UID: "0d63f4c0-d20d-4f41-b5cd-cf5941c79d65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.304856 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.442695 5127 generic.go:334] "Generic (PLEG): container finished" podID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerID="d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba" exitCode=0 Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.442781 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s8zg" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.442812 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerDied","Data":"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba"} Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.443877 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s8zg" event={"ID":"0d63f4c0-d20d-4f41-b5cd-cf5941c79d65","Type":"ContainerDied","Data":"e5bd249a1a588630cd1890b9d550294efad57f78376e96023f7c6898b3820cb6"} Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.443981 5127 scope.go:117] "RemoveContainer" containerID="d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.481615 5127 scope.go:117] "RemoveContainer" containerID="4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.486344 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.496349 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4s8zg"] Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.529297 5127 scope.go:117] "RemoveContainer" containerID="61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.570784 5127 scope.go:117] "RemoveContainer" containerID="d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba" Feb 01 09:02:33 crc kubenswrapper[5127]: E0201 09:02:33.571356 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba\": container with ID starting with d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba not found: ID does not exist" containerID="d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.571405 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba"} err="failed to get container status \"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba\": rpc error: code = NotFound desc = could not find container \"d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba\": container with ID starting with d3d443c57ece956983a51e221439e5ac7945a7647e444f31e3561dee9a87a0ba not found: ID does not exist" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.571433 5127 scope.go:117] "RemoveContainer" containerID="4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e" Feb 01 09:02:33 crc kubenswrapper[5127]: E0201 09:02:33.571845 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e\": container with ID starting with 4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e not found: ID does not exist" containerID="4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.571875 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e"} err="failed to get container status \"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e\": rpc error: code = NotFound desc = could not find container \"4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e\": container with ID starting with 4c35a7da3324b128401d8624879d13bfaa32519a7853e330649394742968d49e not found: ID does not exist" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.571896 5127 scope.go:117] "RemoveContainer" containerID="61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920" Feb 01 09:02:33 crc kubenswrapper[5127]: E0201 09:02:33.572184 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920\": container with ID starting with 61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920 not found: ID does not exist" containerID="61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920" Feb 01 09:02:33 crc kubenswrapper[5127]: I0201 09:02:33.572209 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920"} err="failed to get container status \"61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920\": rpc error: code = NotFound desc = could not find container \"61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920\": container with ID starting with 61fac824caae8c234c24e9cde1d2fccf5ec3179db76afe67de1b1e5491ffb920 not found: ID does not exist" Feb 01 09:02:34 crc kubenswrapper[5127]: I0201 09:02:34.256205 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" path="/var/lib/kubelet/pods/0d63f4c0-d20d-4f41-b5cd-cf5941c79d65/volumes" Feb 01 09:02:36 crc kubenswrapper[5127]: I0201 09:02:36.741374 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:02:36 crc kubenswrapper[5127]: I0201 09:02:36.742015 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:03:06 crc kubenswrapper[5127]: I0201 09:03:06.741251 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:03:06 crc kubenswrapper[5127]: I0201 09:03:06.741994 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:03:23 crc kubenswrapper[5127]: I0201 
09:03:23.054253 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-lcpbf"] Feb 01 09:03:23 crc kubenswrapper[5127]: I0201 09:03:23.067839 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-84b9-account-create-update-zttlk"] Feb 01 09:03:23 crc kubenswrapper[5127]: I0201 09:03:23.081705 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-84b9-account-create-update-zttlk"] Feb 01 09:03:23 crc kubenswrapper[5127]: I0201 09:03:23.094549 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-lcpbf"] Feb 01 09:03:24 crc kubenswrapper[5127]: I0201 09:03:24.251262 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a45274-c5ce-49fa-a520-d69f46c46b93" path="/var/lib/kubelet/pods/b0a45274-c5ce-49fa-a520-d69f46c46b93/volumes" Feb 01 09:03:24 crc kubenswrapper[5127]: I0201 09:03:24.252087 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93aa97d-5839-45fb-91a0-dd94e95495fa" path="/var/lib/kubelet/pods/e93aa97d-5839-45fb-91a0-dd94e95495fa/volumes" Feb 01 09:03:35 crc kubenswrapper[5127]: I0201 09:03:35.052764 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-wlfqn"] Feb 01 09:03:35 crc kubenswrapper[5127]: I0201 09:03:35.063247 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-wlfqn"] Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.248935 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8" path="/var/lib/kubelet/pods/cf3a4882-e282-4a1f-ae54-1ef4aa54d1d8/volumes" Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.740855 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.740920 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.740987 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.741706 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:03:36 crc kubenswrapper[5127]: I0201 09:03:36.741767 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" gracePeriod=600 Feb 01 09:03:36 crc kubenswrapper[5127]: E0201 09:03:36.863515 5127 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:03:37 crc kubenswrapper[5127]: I0201 09:03:37.291595 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" exitCode=0 Feb 01 09:03:37 crc kubenswrapper[5127]: I0201 09:03:37.291634 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929"} Feb 01 09:03:37 crc kubenswrapper[5127]: I0201 09:03:37.291880 5127 scope.go:117] "RemoveContainer" containerID="b4ceab963545525742e7dbbad1c6ab5e57888ffbb6bdc3d59e9d5f3f5ac47153" Feb 01 09:03:37 crc kubenswrapper[5127]: I0201 09:03:37.292843 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:03:37 crc kubenswrapper[5127]: E0201 09:03:37.293296 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:03:48 crc kubenswrapper[5127]: I0201 09:03:48.416450 5127 scope.go:117] "RemoveContainer" containerID="605a99769321cade893991a0145c19b2ab7d6c8570067852615a0ad38846adc6" Feb 01 09:03:48 crc kubenswrapper[5127]: I0201 09:03:48.469311 5127 scope.go:117] "RemoveContainer" containerID="b5479f2e38ac4243bc5a0ea8dba1c792c416ea275a16c05b7a6eddcf799e8836" Feb 01 09:03:48 crc kubenswrapper[5127]: I0201 09:03:48.508023 5127 scope.go:117] "RemoveContainer" containerID="3eb9d93708f4f0e8cc469dd1e54b48d11fb4eba439d7cf79f7aeb2d60c3064f5" Feb 01 09:03:52 crc kubenswrapper[5127]: I0201 09:03:52.235746 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:03:52 crc kubenswrapper[5127]: E0201 09:03:52.236860 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:03:53 crc kubenswrapper[5127]: I0201 09:03:53.037220 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-7s4ch"] Feb 01 09:03:53 crc kubenswrapper[5127]: I0201 09:03:53.046170 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-7s4ch"] Feb 01 09:03:54 crc kubenswrapper[5127]: I0201 09:03:54.040452 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-85ba-account-create-update-ldkxf"] Feb 01 09:03:54 crc 
kubenswrapper[5127]: I0201 09:03:54.054977 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-85ba-account-create-update-ldkxf"] Feb 01 09:03:54 crc kubenswrapper[5127]: I0201 09:03:54.248620 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41691139-f2d8-4a68-aa1e-205689c2b1c6" path="/var/lib/kubelet/pods/41691139-f2d8-4a68-aa1e-205689c2b1c6/volumes" Feb 01 09:03:54 crc kubenswrapper[5127]: I0201 09:03:54.249349 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8e7f56-b3ce-4221-b098-ea7756476a7a" path="/var/lib/kubelet/pods/4d8e7f56-b3ce-4221-b098-ea7756476a7a/volumes" Feb 01 09:04:06 crc kubenswrapper[5127]: I0201 09:04:06.064853 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-vjdm5"] Feb 01 09:04:06 crc kubenswrapper[5127]: I0201 09:04:06.085288 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-vjdm5"] Feb 01 09:04:06 crc kubenswrapper[5127]: I0201 09:04:06.236008 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:04:06 crc kubenswrapper[5127]: E0201 09:04:06.236383 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:04:06 crc kubenswrapper[5127]: I0201 09:04:06.249399 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5802b898-6911-41ee-911f-c63f88207d79" path="/var/lib/kubelet/pods/5802b898-6911-41ee-911f-c63f88207d79/volumes" Feb 01 09:04:16 crc kubenswrapper[5127]: I0201 09:04:16.692699 5127 generic.go:334] "Generic (PLEG): container finished" podID="833a198f-222c-4ce9-a629-f1138fbd1fce" containerID="978804468bbc54fe25d88054c589ac8516b7af10d4852c7eea107c1fcfb042a7" exitCode=0 Feb 01 09:04:16 crc kubenswrapper[5127]: I0201 09:04:16.693457 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" event={"ID":"833a198f-222c-4ce9-a629-f1138fbd1fce","Type":"ContainerDied","Data":"978804468bbc54fe25d88054c589ac8516b7af10d4852c7eea107c1fcfb042a7"} Feb 01 09:04:17 crc kubenswrapper[5127]: I0201 09:04:17.235730 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:04:17 crc kubenswrapper[5127]: E0201 09:04:17.236001 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.229701 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.244741 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory\") pod \"833a198f-222c-4ce9-a629-f1138fbd1fce\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.245019 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfpp\" (UniqueName: \"kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp\") pod \"833a198f-222c-4ce9-a629-f1138fbd1fce\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.245335 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph\") pod \"833a198f-222c-4ce9-a629-f1138fbd1fce\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.245442 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1\") pod \"833a198f-222c-4ce9-a629-f1138fbd1fce\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.245607 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle\") pod \"833a198f-222c-4ce9-a629-f1138fbd1fce\" (UID: \"833a198f-222c-4ce9-a629-f1138fbd1fce\") " Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.253012 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph" (OuterVolumeSpecName: "ceph") pod "833a198f-222c-4ce9-a629-f1138fbd1fce" (UID: "833a198f-222c-4ce9-a629-f1138fbd1fce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.263821 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "833a198f-222c-4ce9-a629-f1138fbd1fce" (UID: "833a198f-222c-4ce9-a629-f1138fbd1fce"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.269942 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp" (OuterVolumeSpecName: "kube-api-access-xqfpp") pod "833a198f-222c-4ce9-a629-f1138fbd1fce" (UID: "833a198f-222c-4ce9-a629-f1138fbd1fce"). InnerVolumeSpecName "kube-api-access-xqfpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.275804 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory" (OuterVolumeSpecName: "inventory") pod "833a198f-222c-4ce9-a629-f1138fbd1fce" (UID: "833a198f-222c-4ce9-a629-f1138fbd1fce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.294427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "833a198f-222c-4ce9-a629-f1138fbd1fce" (UID: "833a198f-222c-4ce9-a629-f1138fbd1fce"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.348809 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.348843 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.348854 5127 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.348863 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/833a198f-222c-4ce9-a629-f1138fbd1fce-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.348873 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfpp\" (UniqueName: \"kubernetes.io/projected/833a198f-222c-4ce9-a629-f1138fbd1fce-kube-api-access-xqfpp\") on node \"crc\" DevicePath \"\"" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.711889 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" event={"ID":"833a198f-222c-4ce9-a629-f1138fbd1fce","Type":"ContainerDied","Data":"be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c"} Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.711926 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be882afc2a1930f2c5c6bc567ffda85de59ea5cb273f8ac8840f97422025343c" Feb 01 09:04:18 crc kubenswrapper[5127]: I0201 09:04:18.711977 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.870719 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pmwr5"] Feb 01 09:04:23 crc kubenswrapper[5127]: E0201 09:04:23.872002 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="extract-content" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872030 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="extract-content" Feb 01 09:04:23 crc kubenswrapper[5127]: E0201 09:04:23.872061 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="registry-server" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872073 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="registry-server" Feb 01 09:04:23 crc kubenswrapper[5127]: E0201 09:04:23.872119 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="extract-utilities" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872133 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="extract-utilities" Feb 01 09:04:23 crc kubenswrapper[5127]: E0201 09:04:23.872159 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833a198f-222c-4ce9-a629-f1138fbd1fce" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872174 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="833a198f-222c-4ce9-a629-f1138fbd1fce" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872472 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="833a198f-222c-4ce9-a629-f1138fbd1fce" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.872528 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d63f4c0-d20d-4f41-b5cd-cf5941c79d65" containerName="registry-server" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.873725 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.878670 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.879273 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.879407 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.879279 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.887542 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pmwr5"] Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.909687 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wkc7q"] Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.911440 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.917287 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.917331 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.950546 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wkc7q"] Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.971523 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8b6h\" (UniqueName: \"kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.971702 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.971879 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.971908 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory\") pod 
\"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.971941 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.972004 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64r6d\" (UniqueName: \"kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.972034 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.972049 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:23 crc kubenswrapper[5127]: I0201 09:04:23.972105 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073635 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073773 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073807 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" 
(UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073839 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073871 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64r6d\" (UniqueName: \"kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073903 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073928 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.073964 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.074012 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8b6h\" (UniqueName: \"kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.080522 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.080968 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.080989 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.081425 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.086482 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.086723 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.087193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.093578 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64r6d\" (UniqueName: \"kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d\") pod \"bootstrap-openstack-openstack-networker-wkc7q\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.104222 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8b6h\" (UniqueName: \"kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h\") pod \"bootstrap-openstack-openstack-cell1-pmwr5\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.209574 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.242128 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:04:24 crc kubenswrapper[5127]: I0201 09:04:24.874411 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-wkc7q"] Feb 01 09:04:25 crc kubenswrapper[5127]: I0201 09:04:25.414919 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-pmwr5"] Feb 01 09:04:25 crc kubenswrapper[5127]: W0201 09:04:25.418104 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cd979e7_d370_42d4_a165_9c0792d98a4d.slice/crio-210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7 WatchSource:0}: Error finding container 210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7: Status 404 returned error can't find the container with id 210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7 Feb 01 09:04:25 crc kubenswrapper[5127]: I0201 09:04:25.798605 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" event={"ID":"5f10050c-2269-4866-8d6d-70ebc730eca3","Type":"ContainerStarted","Data":"7ca027387dfd2bcfe2e1a750e3370c5740b070b696cee1db952930846f5cde5a"} Feb 01 09:04:25 crc kubenswrapper[5127]: I0201 09:04:25.798947 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" event={"ID":"5f10050c-2269-4866-8d6d-70ebc730eca3","Type":"ContainerStarted","Data":"6697c77f96e61048faa187a011b1fe21f751901d4ab5c26b907b0a5b06d18689"} Feb 01 09:04:25 crc kubenswrapper[5127]: I0201 09:04:25.801258 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" event={"ID":"0cd979e7-d370-42d4-a165-9c0792d98a4d","Type":"ContainerStarted","Data":"210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7"} Feb 01 09:04:26 crc kubenswrapper[5127]: I0201 09:04:26.815313 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" event={"ID":"0cd979e7-d370-42d4-a165-9c0792d98a4d","Type":"ContainerStarted","Data":"426a7c4ec8a42f119ef8c500f223c954b00586bcf04d036c87933cfaadd09e64"} Feb 01 09:04:26 crc kubenswrapper[5127]: I0201 09:04:26.842504 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" podStartSLOduration=3.353096167 podStartE2EDuration="3.842476406s" podCreationTimestamp="2026-02-01 09:04:23 +0000 UTC" firstStartedPulling="2026-02-01 09:04:25.421741586 +0000 UTC m=+8215.907643949" lastFinishedPulling="2026-02-01 09:04:25.911121825 +0000 UTC m=+8216.397024188" observedRunningTime="2026-02-01 09:04:26.838642122 +0000 UTC m=+8217.324544505" watchObservedRunningTime="2026-02-01 09:04:26.842476406 +0000 UTC m=+8217.328378809" Feb 01 09:04:26 crc kubenswrapper[5127]: I0201 09:04:26.849226 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" podStartSLOduration=3.237411663 podStartE2EDuration="3.849200335s" podCreationTimestamp="2026-02-01 09:04:23 +0000 UTC" firstStartedPulling="2026-02-01 09:04:24.882527485 +0000 UTC m=+8215.368429848" lastFinishedPulling="2026-02-01 09:04:25.494316147 +0000 UTC m=+8215.980218520" observedRunningTime="2026-02-01 09:04:25.832718268 +0000 UTC m=+8216.318620631" watchObservedRunningTime="2026-02-01 
09:04:26.849200335 +0000 UTC m=+8217.335102738" Feb 01 09:04:29 crc kubenswrapper[5127]: I0201 09:04:29.236695 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:04:29 crc kubenswrapper[5127]: E0201 09:04:29.237562 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:04:44 crc kubenswrapper[5127]: I0201 09:04:44.235395 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:04:44 crc kubenswrapper[5127]: E0201 09:04:44.236216 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:04:48 crc kubenswrapper[5127]: I0201 09:04:48.640080 5127 scope.go:117] "RemoveContainer" containerID="41118b7e5a1f623763847f62d37dababa60c350faf1cb65916f3180f27d57dd6" Feb 01 09:04:48 crc kubenswrapper[5127]: I0201 09:04:48.695226 5127 scope.go:117] "RemoveContainer" containerID="4dabf030ade965dc040c3eebb7a9945d7443c99bb1f39990a449baa478fe0d05" Feb 01 09:04:48 crc kubenswrapper[5127]: I0201 09:04:48.733784 5127 scope.go:117] "RemoveContainer" containerID="f67e806745fad421813caa3beb970830980e8f74d85e3ccd70e669302975d63c" Feb 01 09:04:58 crc kubenswrapper[5127]: I0201 09:04:58.236976 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:04:58 crc kubenswrapper[5127]: E0201 09:04:58.238037 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:05:11 crc kubenswrapper[5127]: I0201 09:05:11.236382 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:05:11 crc kubenswrapper[5127]: E0201 09:05:11.239057 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:05:25 crc kubenswrapper[5127]: I0201 09:05:25.237131 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:05:25 crc kubenswrapper[5127]: E0201 09:05:25.238381 5127 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:05:36 crc kubenswrapper[5127]: I0201 09:05:36.235688 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:05:36 crc kubenswrapper[5127]: E0201 09:05:36.236955 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:05:47 crc kubenswrapper[5127]: I0201 09:05:47.236645 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:05:47 crc kubenswrapper[5127]: E0201 09:05:47.239564 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:06:01 crc kubenswrapper[5127]: I0201 09:06:01.250539 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:06:01 crc kubenswrapper[5127]: E0201 09:06:01.252472 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:06:12 crc kubenswrapper[5127]: I0201 09:06:12.235915 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:06:12 crc kubenswrapper[5127]: E0201 09:06:12.236806 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:06:27 crc kubenswrapper[5127]: I0201 09:06:27.236536 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:06:27 crc kubenswrapper[5127]: E0201 09:06:27.237803 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:06:38 crc kubenswrapper[5127]: I0201 09:06:38.240372 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:06:38 crc kubenswrapper[5127]: E0201 09:06:38.241413 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:06:51 crc kubenswrapper[5127]: I0201 09:06:51.235813 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:06:51 crc kubenswrapper[5127]: E0201 09:06:51.237087 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:07:04 crc kubenswrapper[5127]: I0201 09:07:04.237489 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:07:04 crc kubenswrapper[5127]: E0201 09:07:04.238423 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:07:15 crc kubenswrapper[5127]: I0201 09:07:15.235529 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:07:15 crc kubenswrapper[5127]: E0201 09:07:15.236464 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:07:20 crc kubenswrapper[5127]: I0201 09:07:20.727777 5127 generic.go:334] "Generic (PLEG): container finished" podID="0cd979e7-d370-42d4-a165-9c0792d98a4d" containerID="426a7c4ec8a42f119ef8c500f223c954b00586bcf04d036c87933cfaadd09e64" exitCode=0 Feb 01 09:07:20 crc kubenswrapper[5127]: I0201 09:07:20.727874 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" event={"ID":"0cd979e7-d370-42d4-a165-9c0792d98a4d","Type":"ContainerDied","Data":"426a7c4ec8a42f119ef8c500f223c954b00586bcf04d036c87933cfaadd09e64"} Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.284995 
5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.467418 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph\") pod \"0cd979e7-d370-42d4-a165-9c0792d98a4d\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.467556 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1\") pod \"0cd979e7-d370-42d4-a165-9c0792d98a4d\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.467601 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory\") pod \"0cd979e7-d370-42d4-a165-9c0792d98a4d\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.467742 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle\") pod \"0cd979e7-d370-42d4-a165-9c0792d98a4d\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.467763 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8b6h\" (UniqueName: \"kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h\") pod \"0cd979e7-d370-42d4-a165-9c0792d98a4d\" (UID: \"0cd979e7-d370-42d4-a165-9c0792d98a4d\") " Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.474428 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0cd979e7-d370-42d4-a165-9c0792d98a4d" (UID: "0cd979e7-d370-42d4-a165-9c0792d98a4d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.474868 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph" (OuterVolumeSpecName: "ceph") pod "0cd979e7-d370-42d4-a165-9c0792d98a4d" (UID: "0cd979e7-d370-42d4-a165-9c0792d98a4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.484958 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h" (OuterVolumeSpecName: "kube-api-access-k8b6h") pod "0cd979e7-d370-42d4-a165-9c0792d98a4d" (UID: "0cd979e7-d370-42d4-a165-9c0792d98a4d"). InnerVolumeSpecName "kube-api-access-k8b6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.506045 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0cd979e7-d370-42d4-a165-9c0792d98a4d" (UID: "0cd979e7-d370-42d4-a165-9c0792d98a4d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.522565 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory" (OuterVolumeSpecName: "inventory") pod "0cd979e7-d370-42d4-a165-9c0792d98a4d" (UID: "0cd979e7-d370-42d4-a165-9c0792d98a4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.570540 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.570778 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.570796 5127 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.570968 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8b6h\" (UniqueName: \"kubernetes.io/projected/0cd979e7-d370-42d4-a165-9c0792d98a4d-kube-api-access-k8b6h\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.570986 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0cd979e7-d370-42d4-a165-9c0792d98a4d-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.750231 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" event={"ID":"0cd979e7-d370-42d4-a165-9c0792d98a4d","Type":"ContainerDied","Data":"210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7"} Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.750562 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="210e5d0d26a4f7ba1188ae2e25065b3ee2fa947dd5ead829d3469ef3302a66c7" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.750292 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-pmwr5" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.890740 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-vxnlw"] Feb 01 09:07:22 crc kubenswrapper[5127]: E0201 09:07:22.891286 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd979e7-d370-42d4-a165-9c0792d98a4d" containerName="bootstrap-openstack-openstack-cell1" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.891312 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd979e7-d370-42d4-a165-9c0792d98a4d" containerName="bootstrap-openstack-openstack-cell1" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.891672 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd979e7-d370-42d4-a165-9c0792d98a4d" containerName="bootstrap-openstack-openstack-cell1" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.892669 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.897195 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.897846 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:07:22 crc kubenswrapper[5127]: I0201 09:07:22.923417 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-vxnlw"] Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.082191 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.082258 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.082298 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.083049 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84gc\" (UniqueName: \"kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.186206 5127 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.186296 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.186354 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.186419 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r84gc\" (UniqueName: \"kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.191140 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.196128 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.203497 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.207515 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84gc\" (UniqueName: \"kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc\") pod \"download-cache-openstack-openstack-cell1-vxnlw\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.220869 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.807156 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-vxnlw"] Feb 01 09:07:23 crc kubenswrapper[5127]: W0201 09:07:23.814019 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod283b25af_dc74_4623_9e82_90c0a32e6fc5.slice/crio-e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f WatchSource:0}: Error finding container e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f: Status 404 returned error can't find the container with id e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f Feb 01 09:07:23 crc kubenswrapper[5127]: I0201 09:07:23.816767 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:07:24 crc kubenswrapper[5127]: I0201 09:07:24.771623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" event={"ID":"283b25af-dc74-4623-9e82-90c0a32e6fc5","Type":"ContainerStarted","Data":"8806d6398a61bbf36443707aa18f06c970e33911e6d795f84a5878e233911eaf"} Feb 01 09:07:24 crc kubenswrapper[5127]: I0201 09:07:24.771954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" event={"ID":"283b25af-dc74-4623-9e82-90c0a32e6fc5","Type":"ContainerStarted","Data":"e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f"} Feb 01 09:07:24 crc kubenswrapper[5127]: I0201 09:07:24.797811 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" podStartSLOduration=2.400081503 podStartE2EDuration="2.79778553s" podCreationTimestamp="2026-02-01 09:07:22 +0000 UTC" firstStartedPulling="2026-02-01 09:07:23.816493654 +0000 UTC m=+8394.302396027" lastFinishedPulling="2026-02-01 09:07:24.214197681 +0000 UTC m=+8394.700100054" observedRunningTime="2026-02-01 09:07:24.790068754 +0000 UTC m=+8395.275971157" watchObservedRunningTime="2026-02-01 09:07:24.79778553 +0000 UTC m=+8395.283687933" Feb 01 09:07:25 crc kubenswrapper[5127]: I0201 09:07:25.782890 5127 generic.go:334] "Generic (PLEG): container finished" podID="5f10050c-2269-4866-8d6d-70ebc730eca3" containerID="7ca027387dfd2bcfe2e1a750e3370c5740b070b696cee1db952930846f5cde5a" exitCode=0 Feb 01 09:07:25 crc kubenswrapper[5127]: I0201 09:07:25.783004 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" event={"ID":"5f10050c-2269-4866-8d6d-70ebc730eca3","Type":"ContainerDied","Data":"7ca027387dfd2bcfe2e1a750e3370c5740b070b696cee1db952930846f5cde5a"} Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.236165 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:07:27 crc kubenswrapper[5127]: E0201 09:07:27.236878 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 
09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.334213 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.381115 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory\") pod \"5f10050c-2269-4866-8d6d-70ebc730eca3\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.381177 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64r6d\" (UniqueName: \"kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d\") pod \"5f10050c-2269-4866-8d6d-70ebc730eca3\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.387500 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d" (OuterVolumeSpecName: "kube-api-access-64r6d") pod "5f10050c-2269-4866-8d6d-70ebc730eca3" (UID: "5f10050c-2269-4866-8d6d-70ebc730eca3"). InnerVolumeSpecName "kube-api-access-64r6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.420228 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory" (OuterVolumeSpecName: "inventory") pod "5f10050c-2269-4866-8d6d-70ebc730eca3" (UID: "5f10050c-2269-4866-8d6d-70ebc730eca3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.483258 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker\") pod \"5f10050c-2269-4866-8d6d-70ebc730eca3\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.483542 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle\") pod \"5f10050c-2269-4866-8d6d-70ebc730eca3\" (UID: \"5f10050c-2269-4866-8d6d-70ebc730eca3\") " Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.484595 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.484620 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64r6d\" (UniqueName: \"kubernetes.io/projected/5f10050c-2269-4866-8d6d-70ebc730eca3-kube-api-access-64r6d\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.486394 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5f10050c-2269-4866-8d6d-70ebc730eca3" (UID: "5f10050c-2269-4866-8d6d-70ebc730eca3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.518053 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5f10050c-2269-4866-8d6d-70ebc730eca3" (UID: "5f10050c-2269-4866-8d6d-70ebc730eca3"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.587685 5127 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.587743 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5f10050c-2269-4866-8d6d-70ebc730eca3-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.809198 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" event={"ID":"5f10050c-2269-4866-8d6d-70ebc730eca3","Type":"ContainerDied","Data":"6697c77f96e61048faa187a011b1fe21f751901d4ab5c26b907b0a5b06d18689"} Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.809241 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6697c77f96e61048faa187a011b1fe21f751901d4ab5c26b907b0a5b06d18689" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.809301 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-wkc7q" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.918703 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-mm8b7"] Feb 01 09:07:27 crc kubenswrapper[5127]: E0201 09:07:27.919485 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f10050c-2269-4866-8d6d-70ebc730eca3" containerName="bootstrap-openstack-openstack-networker" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.919512 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f10050c-2269-4866-8d6d-70ebc730eca3" containerName="bootstrap-openstack-openstack-networker" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.919757 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f10050c-2269-4866-8d6d-70ebc730eca3" containerName="bootstrap-openstack-openstack-networker" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.920518 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.924026 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.924329 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.933912 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-mm8b7"] Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.996673 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qjg\" (UniqueName: \"kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.996778 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:27 crc kubenswrapper[5127]: I0201 09:07:27.996880 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.098076 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.098212 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qjg\" (UniqueName: \"kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.098312 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.102163 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.102312 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.114344 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qjg\" (UniqueName: \"kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg\") pod \"download-cache-openstack-openstack-networker-mm8b7\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.244145 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:07:28 crc kubenswrapper[5127]: I0201 09:07:28.804574 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-mm8b7"] Feb 01 09:07:28 crc kubenswrapper[5127]: W0201 09:07:28.817191 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod896963d4_1a35_41cc_81b4_d5695874f82a.slice/crio-d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc WatchSource:0}: Error finding container d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc: Status 404 returned error can't find the container with id d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc Feb 01 09:07:29 crc kubenswrapper[5127]: I0201 09:07:29.831643 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" event={"ID":"896963d4-1a35-41cc-81b4-d5695874f82a","Type":"ContainerStarted","Data":"5d597004947f686c2be1f42d0038ce7be7f34a096e56213b318d89c9fde4606d"} Feb 01 09:07:29 crc kubenswrapper[5127]: I0201 09:07:29.831990 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" event={"ID":"896963d4-1a35-41cc-81b4-d5695874f82a","Type":"ContainerStarted","Data":"d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc"} Feb 01 09:07:29 crc kubenswrapper[5127]: I0201 09:07:29.857241 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" podStartSLOduration=2.450089309 podStartE2EDuration="2.857220318s" podCreationTimestamp="2026-02-01 09:07:27 +0000 UTC" firstStartedPulling="2026-02-01 09:07:28.820423978 +0000 UTC m=+8399.306326351" lastFinishedPulling="2026-02-01 09:07:29.227555007 +0000 UTC m=+8399.713457360" observedRunningTime="2026-02-01 09:07:29.854154336 +0000 UTC m=+8400.340056729" watchObservedRunningTime="2026-02-01 09:07:29.857220318 +0000 UTC m=+8400.343122691" Feb 01 09:07:41 crc kubenswrapper[5127]: I0201 09:07:41.237010 5127 scope.go:117] "RemoveContainer" 
containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:07:41 crc kubenswrapper[5127]: E0201 09:07:41.238696 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:07:52 crc kubenswrapper[5127]: I0201 09:07:52.236823 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:07:52 crc kubenswrapper[5127]: E0201 09:07:52.238661 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:08:06 crc kubenswrapper[5127]: I0201 09:08:06.236003 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:08:06 crc kubenswrapper[5127]: E0201 09:08:06.237036 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:08:19 crc kubenswrapper[5127]: I0201 09:08:19.235628 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:08:19 crc kubenswrapper[5127]: E0201 09:08:19.236334 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:08:31 crc kubenswrapper[5127]: I0201 09:08:31.236317 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:08:31 crc kubenswrapper[5127]: E0201 09:08:31.237390 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:08:36 crc kubenswrapper[5127]: I0201 09:08:36.642045 5127 generic.go:334] "Generic (PLEG): container finished" podID="896963d4-1a35-41cc-81b4-d5695874f82a" containerID="5d597004947f686c2be1f42d0038ce7be7f34a096e56213b318d89c9fde4606d" exitCode=0 Feb 01 09:08:36 crc 
kubenswrapper[5127]: I0201 09:08:36.642138 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" event={"ID":"896963d4-1a35-41cc-81b4-d5695874f82a","Type":"ContainerDied","Data":"5d597004947f686c2be1f42d0038ce7be7f34a096e56213b318d89c9fde4606d"} Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.171593 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.346332 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6qjg\" (UniqueName: \"kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg\") pod \"896963d4-1a35-41cc-81b4-d5695874f82a\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.346604 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory\") pod \"896963d4-1a35-41cc-81b4-d5695874f82a\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.346657 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker\") pod \"896963d4-1a35-41cc-81b4-d5695874f82a\" (UID: \"896963d4-1a35-41cc-81b4-d5695874f82a\") " Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.364052 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg" (OuterVolumeSpecName: "kube-api-access-c6qjg") pod "896963d4-1a35-41cc-81b4-d5695874f82a" (UID: "896963d4-1a35-41cc-81b4-d5695874f82a"). InnerVolumeSpecName "kube-api-access-c6qjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.374222 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory" (OuterVolumeSpecName: "inventory") pod "896963d4-1a35-41cc-81b4-d5695874f82a" (UID: "896963d4-1a35-41cc-81b4-d5695874f82a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.396756 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "896963d4-1a35-41cc-81b4-d5695874f82a" (UID: "896963d4-1a35-41cc-81b4-d5695874f82a"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.449115 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.449149 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/896963d4-1a35-41cc-81b4-d5695874f82a-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.449160 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6qjg\" (UniqueName: \"kubernetes.io/projected/896963d4-1a35-41cc-81b4-d5695874f82a-kube-api-access-c6qjg\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.665466 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" event={"ID":"896963d4-1a35-41cc-81b4-d5695874f82a","Type":"ContainerDied","Data":"d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc"} Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.665506 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35b4094d0daed6a722635d8a1e486b9784d8214604047b81bf74bc4b82c86cc" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.665521 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-mm8b7" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.810930 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-9mfdv"] Feb 01 09:08:38 crc kubenswrapper[5127]: E0201 09:08:38.811518 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896963d4-1a35-41cc-81b4-d5695874f82a" containerName="download-cache-openstack-openstack-networker" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.811541 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="896963d4-1a35-41cc-81b4-d5695874f82a" containerName="download-cache-openstack-openstack-networker" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.811797 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="896963d4-1a35-41cc-81b4-d5695874f82a" containerName="download-cache-openstack-openstack-networker" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.812678 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.815313 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.815441 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.831523 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-9mfdv"] Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.962133 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.962493 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:38 crc kubenswrapper[5127]: I0201 09:08:38.963023 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxk5n\" (UniqueName: \"kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.065576 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxk5n\" (UniqueName: \"kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.065688 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.065840 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.076514 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.076602 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.090209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxk5n\" (UniqueName: \"kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n\") pod \"configure-network-openstack-openstack-networker-9mfdv\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.145371 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:08:39 crc kubenswrapper[5127]: I0201 09:08:39.716910 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-9mfdv"] Feb 01 09:08:40 crc kubenswrapper[5127]: I0201 09:08:40.690025 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" event={"ID":"fbcbec86-272a-408e-9f57-5478b28fe0ed","Type":"ContainerStarted","Data":"5c7cf4eaed1bc4f4a7dc9fd90da0237ce0910171dcd67b999ddfbc4de78ee082"} Feb 01 09:08:40 crc kubenswrapper[5127]: I0201 09:08:40.690519 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" event={"ID":"fbcbec86-272a-408e-9f57-5478b28fe0ed","Type":"ContainerStarted","Data":"1b40a6882080fe816872f51753f4df4d38bea0a92f3cfdd8efa241bba099ad3f"} Feb 01 09:08:40 crc kubenswrapper[5127]: I0201 09:08:40.717142 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" podStartSLOduration=2.133758383 podStartE2EDuration="2.717117826s" podCreationTimestamp="2026-02-01 09:08:38 +0000 UTC" firstStartedPulling="2026-02-01 09:08:39.724327673 +0000 UTC m=+8470.210230066" lastFinishedPulling="2026-02-01 09:08:40.307687106 +0000 UTC m=+8470.793589509" observedRunningTime="2026-02-01 09:08:40.715267297 +0000 UTC m=+8471.201169700" watchObservedRunningTime="2026-02-01 09:08:40.717117826 +0000 UTC m=+8471.203020229" Feb 01 09:08:46 crc kubenswrapper[5127]: I0201 09:08:46.236469 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:08:46 crc kubenswrapper[5127]: I0201 09:08:46.751995 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951"} Feb 01 09:08:55 crc kubenswrapper[5127]: I0201 09:08:55.842560 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="283b25af-dc74-4623-9e82-90c0a32e6fc5" containerID="8806d6398a61bbf36443707aa18f06c970e33911e6d795f84a5878e233911eaf" exitCode=0 Feb 01 09:08:55 crc kubenswrapper[5127]: I0201 09:08:55.842742 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" event={"ID":"283b25af-dc74-4623-9e82-90c0a32e6fc5","Type":"ContainerDied","Data":"8806d6398a61bbf36443707aa18f06c970e33911e6d795f84a5878e233911eaf"} Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.341532 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.498729 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1\") pod \"283b25af-dc74-4623-9e82-90c0a32e6fc5\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.498763 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory\") pod \"283b25af-dc74-4623-9e82-90c0a32e6fc5\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.499681 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r84gc\" (UniqueName: \"kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc\") pod \"283b25af-dc74-4623-9e82-90c0a32e6fc5\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.499982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph\") pod \"283b25af-dc74-4623-9e82-90c0a32e6fc5\" (UID: \"283b25af-dc74-4623-9e82-90c0a32e6fc5\") " Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.508014 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc" (OuterVolumeSpecName: "kube-api-access-r84gc") pod "283b25af-dc74-4623-9e82-90c0a32e6fc5" (UID: "283b25af-dc74-4623-9e82-90c0a32e6fc5"). InnerVolumeSpecName "kube-api-access-r84gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.510801 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph" (OuterVolumeSpecName: "ceph") pod "283b25af-dc74-4623-9e82-90c0a32e6fc5" (UID: "283b25af-dc74-4623-9e82-90c0a32e6fc5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.529432 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "283b25af-dc74-4623-9e82-90c0a32e6fc5" (UID: "283b25af-dc74-4623-9e82-90c0a32e6fc5"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.538375 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory" (OuterVolumeSpecName: "inventory") pod "283b25af-dc74-4623-9e82-90c0a32e6fc5" (UID: "283b25af-dc74-4623-9e82-90c0a32e6fc5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.603379 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.603455 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.603470 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/283b25af-dc74-4623-9e82-90c0a32e6fc5-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.603482 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r84gc\" (UniqueName: \"kubernetes.io/projected/283b25af-dc74-4623-9e82-90c0a32e6fc5-kube-api-access-r84gc\") on node \"crc\" DevicePath \"\"" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.882565 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" event={"ID":"283b25af-dc74-4623-9e82-90c0a32e6fc5","Type":"ContainerDied","Data":"e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f"} Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.882657 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f" Feb 01 09:08:57 crc kubenswrapper[5127]: I0201 09:08:57.882746 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-vxnlw" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.008681 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-bm985"] Feb 01 09:08:58 crc kubenswrapper[5127]: E0201 09:08:58.009141 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="283b25af-dc74-4623-9e82-90c0a32e6fc5" containerName="download-cache-openstack-openstack-cell1" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.009158 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="283b25af-dc74-4623-9e82-90c0a32e6fc5" containerName="download-cache-openstack-openstack-cell1" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.009364 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="283b25af-dc74-4623-9e82-90c0a32e6fc5" containerName="download-cache-openstack-openstack-cell1" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.010072 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.014688 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2ml\" (UniqueName: \"kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.014828 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.014894 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.014943 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.022052 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.022999 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.060789 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-bm985"] Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.128956 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.129161 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2ml\" (UniqueName: \"kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.129290 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory\") pod 
\"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.129350 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.138087 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.157239 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.164823 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2ml\" (UniqueName: \"kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.179268 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-bm985\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.373109 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:08:58 crc kubenswrapper[5127]: E0201 09:08:58.419722 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod283b25af_dc74_4623_9e82_90c0a32e6fc5.slice/crio-e5ff9bf277e50c496c91bda7321d03557fffc4088a490db7d35052a47b04310f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod283b25af_dc74_4623_9e82_90c0a32e6fc5.slice\": RecentStats: unable to find data in memory cache]" Feb 01 09:08:58 crc kubenswrapper[5127]: I0201 09:08:58.998014 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-bm985"] Feb 01 09:08:59 crc kubenswrapper[5127]: W0201 09:08:59.010302 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode605bdd8_d806_44ed_a832_fc7917f53089.slice/crio-95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e WatchSource:0}: Error finding container 95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e: Status 404 returned error can't find the container with id 95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e Feb 01 09:08:59 crc kubenswrapper[5127]: I0201 09:08:59.911125 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-bm985" event={"ID":"e605bdd8-d806-44ed-a832-fc7917f53089","Type":"ContainerStarted","Data":"95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e"} Feb 01 09:09:00 crc kubenswrapper[5127]: I0201 09:09:00.924392 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-bm985" event={"ID":"e605bdd8-d806-44ed-a832-fc7917f53089","Type":"ContainerStarted","Data":"8e613c8a170c4c15bea74fd9e1d5a46e78fca1d577b2cdf1351f07b886cad987"} Feb 01 09:09:00 crc kubenswrapper[5127]: I0201 09:09:00.950255 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-bm985" podStartSLOduration=3.284313306 podStartE2EDuration="3.950233786s" podCreationTimestamp="2026-02-01 09:08:57 +0000 UTC" firstStartedPulling="2026-02-01 09:08:59.015336526 +0000 UTC m=+8489.501238909" lastFinishedPulling="2026-02-01 09:08:59.681257026 +0000 UTC m=+8490.167159389" observedRunningTime="2026-02-01 09:09:00.948420177 +0000 UTC m=+8491.434322550" watchObservedRunningTime="2026-02-01 09:09:00.950233786 +0000 UTC m=+8491.436136159" Feb 01 09:09:00 crc kubenswrapper[5127]: I0201 09:09:00.993102 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:00 crc kubenswrapper[5127]: I0201 09:09:00.996541 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.006661 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.122048 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmd2l\" (UniqueName: \"kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.122177 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.122219 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.184517 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.188328 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.199086 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.224270 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.224644 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.224798 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mtws\" (UniqueName: \"kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.224961 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.225070 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmd2l\" (UniqueName: \"kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.225165 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.225868 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.226237 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.261341 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bmd2l\" (UniqueName: \"kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l\") pod \"community-operators-mswnv\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.327660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mtws\" (UniqueName: \"kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.328080 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.328196 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.329208 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.329878 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.352160 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mtws\" (UniqueName: \"kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws\") pod \"certified-operators-mgp9d\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.353956 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:01 crc kubenswrapper[5127]: I0201 09:09:01.518423 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.005906 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.281459 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.948434 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerID="ad7082bb0e318f5e3f978904a0431bd08056073a255f74c1de14c0e6704c7a3f" exitCode=0 Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.948481 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerDied","Data":"ad7082bb0e318f5e3f978904a0431bd08056073a255f74c1de14c0e6704c7a3f"} Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.948809 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerStarted","Data":"4579017e1c9779d087afa3e6b9e2d39ecf84952b45e668f120131ddd711e8dc6"} Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.950440 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerID="79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab" exitCode=0 Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.950483 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerDied","Data":"79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab"} Feb 01 09:09:02 crc kubenswrapper[5127]: I0201 09:09:02.950512 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerStarted","Data":"b95c0f5469a1e96c064ed6946d0daf4649dd9a7778d79af1a3fb9c4cf3237950"} Feb 01 09:09:03 crc kubenswrapper[5127]: I0201 09:09:03.965636 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerStarted","Data":"50e74324c9fe92095481052e873631546ee7c6d4b8e299407a44e456a88b2a28"} Feb 01 09:09:03 crc kubenswrapper[5127]: I0201 09:09:03.973896 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerStarted","Data":"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829"} Feb 01 09:09:03 crc kubenswrapper[5127]: I0201 09:09:03.986857 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:03 crc kubenswrapper[5127]: I0201 09:09:03.989313 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:03 crc kubenswrapper[5127]: I0201 09:09:03.997890 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.117445 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zhs2\" (UniqueName: \"kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.117570 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.117651 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.219903 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.220007 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zhs2\" (UniqueName: \"kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.220090 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.220523 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.220520 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.238524 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2zhs2\" (UniqueName: \"kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2\") pod \"redhat-marketplace-8t6mf\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.318539 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.956505 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:04 crc kubenswrapper[5127]: W0201 09:09:04.960293 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75aee89_7e67_48c4_ab7e_b9376268685f.slice/crio-58ded7d9fc31502704e10e51bfa804c0f7b310191999fe87fea7a416180ce39b WatchSource:0}: Error finding container 58ded7d9fc31502704e10e51bfa804c0f7b310191999fe87fea7a416180ce39b: Status 404 returned error can't find the container with id 58ded7d9fc31502704e10e51bfa804c0f7b310191999fe87fea7a416180ce39b Feb 01 09:09:04 crc kubenswrapper[5127]: I0201 09:09:04.983772 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerStarted","Data":"58ded7d9fc31502704e10e51bfa804c0f7b310191999fe87fea7a416180ce39b"} Feb 01 09:09:05 crc kubenswrapper[5127]: I0201 09:09:05.996122 5127 generic.go:334] "Generic (PLEG): container finished" podID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerID="848422a89d50c6567cbad4fa8d06a3bdee99c88659e359ce809545d63f316f8f" exitCode=0 Feb 01 09:09:05 crc kubenswrapper[5127]: I0201 09:09:05.996206 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerDied","Data":"848422a89d50c6567cbad4fa8d06a3bdee99c88659e359ce809545d63f316f8f"} Feb 01 09:09:07 crc kubenswrapper[5127]: I0201 09:09:07.009636 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerStarted","Data":"4e75fb741a139554b167ebf4dc678480a2f8889cf4e97dbeda62457d91c69aba"} Feb 01 09:09:07 crc kubenswrapper[5127]: I0201 09:09:07.012278 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerID="50e74324c9fe92095481052e873631546ee7c6d4b8e299407a44e456a88b2a28" exitCode=0 Feb 01 09:09:07 crc kubenswrapper[5127]: I0201 09:09:07.012340 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerDied","Data":"50e74324c9fe92095481052e873631546ee7c6d4b8e299407a44e456a88b2a28"} Feb 01 09:09:07 crc kubenswrapper[5127]: I0201 09:09:07.015541 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerID="fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829" exitCode=0 Feb 01 09:09:07 crc kubenswrapper[5127]: I0201 09:09:07.015614 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" 
event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerDied","Data":"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829"} Feb 01 09:09:08 crc kubenswrapper[5127]: I0201 09:09:08.037768 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerStarted","Data":"d2f9d76fa3090b55b1804fdb56944b0c69bc414f621ca94a4eaad1d10b8d8104"} Feb 01 09:09:08 crc kubenswrapper[5127]: I0201 09:09:08.050069 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerStarted","Data":"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668"} Feb 01 09:09:08 crc kubenswrapper[5127]: I0201 09:09:08.074009 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mswnv" podStartSLOduration=3.49658197 podStartE2EDuration="8.073984166s" podCreationTimestamp="2026-02-01 09:09:00 +0000 UTC" firstStartedPulling="2026-02-01 09:09:02.951546342 +0000 UTC m=+8493.437448705" lastFinishedPulling="2026-02-01 09:09:07.528948538 +0000 UTC m=+8498.014850901" observedRunningTime="2026-02-01 09:09:08.061421159 +0000 UTC m=+8498.547323522" watchObservedRunningTime="2026-02-01 09:09:08.073984166 +0000 UTC m=+8498.559886539" Feb 01 09:09:08 crc kubenswrapper[5127]: I0201 09:09:08.107099 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mgp9d" podStartSLOduration=2.619845096 podStartE2EDuration="7.107080251s" podCreationTimestamp="2026-02-01 09:09:01 +0000 UTC" firstStartedPulling="2026-02-01 09:09:02.95223279 +0000 UTC m=+8493.438135153" lastFinishedPulling="2026-02-01 09:09:07.439467945 +0000 UTC m=+8497.925370308" observedRunningTime="2026-02-01 09:09:08.093729424 +0000 UTC m=+8498.579631827" watchObservedRunningTime="2026-02-01 09:09:08.107080251 +0000 UTC m=+8498.592982614" Feb 01 09:09:09 crc kubenswrapper[5127]: I0201 09:09:09.061574 5127 generic.go:334] "Generic (PLEG): container finished" podID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerID="4e75fb741a139554b167ebf4dc678480a2f8889cf4e97dbeda62457d91c69aba" exitCode=0 Feb 01 09:09:09 crc kubenswrapper[5127]: I0201 09:09:09.061640 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerDied","Data":"4e75fb741a139554b167ebf4dc678480a2f8889cf4e97dbeda62457d91c69aba"} Feb 01 09:09:10 crc kubenswrapper[5127]: I0201 09:09:10.078222 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerStarted","Data":"7c7042dd8219cfb65f3ebc02f529c9d6a3e7b06bc128a4373f43bce2074f66d1"} Feb 01 09:09:10 crc kubenswrapper[5127]: I0201 09:09:10.099289 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8t6mf" podStartSLOduration=3.622825554 podStartE2EDuration="7.099261203s" podCreationTimestamp="2026-02-01 09:09:03 +0000 UTC" firstStartedPulling="2026-02-01 09:09:05.998821354 +0000 UTC m=+8496.484723727" lastFinishedPulling="2026-02-01 09:09:09.475257023 +0000 UTC m=+8499.961159376" observedRunningTime="2026-02-01 09:09:10.096679584 +0000 UTC m=+8500.582581947" watchObservedRunningTime="2026-02-01 
09:09:10.099261203 +0000 UTC m=+8500.585163566" Feb 01 09:09:11 crc kubenswrapper[5127]: I0201 09:09:11.354344 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:11 crc kubenswrapper[5127]: I0201 09:09:11.354722 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:11 crc kubenswrapper[5127]: I0201 09:09:11.520204 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:11 crc kubenswrapper[5127]: I0201 09:09:11.520274 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:12 crc kubenswrapper[5127]: I0201 09:09:12.426118 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-mswnv" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="registry-server" probeResult="failure" output=< Feb 01 09:09:12 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:09:12 crc kubenswrapper[5127]: > Feb 01 09:09:12 crc kubenswrapper[5127]: I0201 09:09:12.576034 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-mgp9d" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="registry-server" probeResult="failure" output=< Feb 01 09:09:12 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:09:12 crc kubenswrapper[5127]: > Feb 01 09:09:14 crc kubenswrapper[5127]: I0201 09:09:14.319038 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:14 crc kubenswrapper[5127]: I0201 09:09:14.319564 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:15 crc kubenswrapper[5127]: I0201 09:09:15.371847 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-8t6mf" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="registry-server" probeResult="failure" output=< Feb 01 09:09:15 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:09:15 crc kubenswrapper[5127]: > Feb 01 09:09:21 crc kubenswrapper[5127]: I0201 09:09:21.421458 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:21 crc kubenswrapper[5127]: I0201 09:09:21.484864 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:21 crc kubenswrapper[5127]: I0201 09:09:21.589938 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:21 crc kubenswrapper[5127]: I0201 09:09:21.649793 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:24 crc kubenswrapper[5127]: I0201 09:09:24.420192 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:24 crc kubenswrapper[5127]: I0201 09:09:24.467284 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:25 crc kubenswrapper[5127]: I0201 09:09:25.978843 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:25 crc kubenswrapper[5127]: I0201 09:09:25.979460 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mswnv" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="registry-server" containerID="cri-o://d2f9d76fa3090b55b1804fdb56944b0c69bc414f621ca94a4eaad1d10b8d8104" gracePeriod=2 Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.291071 5127 generic.go:334] "Generic (PLEG): container finished" podID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerID="d2f9d76fa3090b55b1804fdb56944b0c69bc414f621ca94a4eaad1d10b8d8104" exitCode=0 Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.291117 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerDied","Data":"d2f9d76fa3090b55b1804fdb56944b0c69bc414f621ca94a4eaad1d10b8d8104"} Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.364150 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.364873 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mgp9d" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="registry-server" containerID="cri-o://3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668" gracePeriod=2 Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.557232 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.742318 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities\") pod \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.742808 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content\") pod \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.742857 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmd2l\" (UniqueName: \"kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l\") pod \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\" (UID: \"ed1f1eba-207c-4955-8b0a-2372e7a846d7\") " Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.743272 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities" (OuterVolumeSpecName: "utilities") pod "ed1f1eba-207c-4955-8b0a-2372e7a846d7" (UID: "ed1f1eba-207c-4955-8b0a-2372e7a846d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.743599 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.750976 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l" (OuterVolumeSpecName: "kube-api-access-bmd2l") pod "ed1f1eba-207c-4955-8b0a-2372e7a846d7" (UID: "ed1f1eba-207c-4955-8b0a-2372e7a846d7"). InnerVolumeSpecName "kube-api-access-bmd2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.806687 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed1f1eba-207c-4955-8b0a-2372e7a846d7" (UID: "ed1f1eba-207c-4955-8b0a-2372e7a846d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.845231 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed1f1eba-207c-4955-8b0a-2372e7a846d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.845491 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmd2l\" (UniqueName: \"kubernetes.io/projected/ed1f1eba-207c-4955-8b0a-2372e7a846d7-kube-api-access-bmd2l\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:26 crc kubenswrapper[5127]: I0201 09:09:26.892622 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.050161 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mtws\" (UniqueName: \"kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws\") pod \"b6b01070-51fe-4f94-beff-8f448f55b8f9\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.050782 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities\") pod \"b6b01070-51fe-4f94-beff-8f448f55b8f9\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.050991 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content\") pod \"b6b01070-51fe-4f94-beff-8f448f55b8f9\" (UID: \"b6b01070-51fe-4f94-beff-8f448f55b8f9\") " Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.051780 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities" (OuterVolumeSpecName: "utilities") pod "b6b01070-51fe-4f94-beff-8f448f55b8f9" (UID: "b6b01070-51fe-4f94-beff-8f448f55b8f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.055221 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws" (OuterVolumeSpecName: "kube-api-access-6mtws") pod "b6b01070-51fe-4f94-beff-8f448f55b8f9" (UID: "b6b01070-51fe-4f94-beff-8f448f55b8f9"). InnerVolumeSpecName "kube-api-access-6mtws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.113993 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6b01070-51fe-4f94-beff-8f448f55b8f9" (UID: "b6b01070-51fe-4f94-beff-8f448f55b8f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.153081 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mtws\" (UniqueName: \"kubernetes.io/projected/b6b01070-51fe-4f94-beff-8f448f55b8f9-kube-api-access-6mtws\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.153118 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.153128 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6b01070-51fe-4f94-beff-8f448f55b8f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.306533 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mswnv" event={"ID":"ed1f1eba-207c-4955-8b0a-2372e7a846d7","Type":"ContainerDied","Data":"4579017e1c9779d087afa3e6b9e2d39ecf84952b45e668f120131ddd711e8dc6"} Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.306571 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mswnv" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.306640 5127 scope.go:117] "RemoveContainer" containerID="d2f9d76fa3090b55b1804fdb56944b0c69bc414f621ca94a4eaad1d10b8d8104" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.311427 5127 generic.go:334] "Generic (PLEG): container finished" podID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerID="3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668" exitCode=0 Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.311471 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerDied","Data":"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668"} Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.311485 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mgp9d" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.311500 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mgp9d" event={"ID":"b6b01070-51fe-4f94-beff-8f448f55b8f9","Type":"ContainerDied","Data":"b95c0f5469a1e96c064ed6946d0daf4649dd9a7778d79af1a3fb9c4cf3237950"} Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.351057 5127 scope.go:117] "RemoveContainer" containerID="50e74324c9fe92095481052e873631546ee7c6d4b8e299407a44e456a88b2a28" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.381457 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.404718 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mswnv"] Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.415517 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.421885 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mgp9d"] Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.422804 5127 scope.go:117] "RemoveContainer" containerID="ad7082bb0e318f5e3f978904a0431bd08056073a255f74c1de14c0e6704c7a3f" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.470847 5127 scope.go:117] "RemoveContainer" containerID="3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.512255 5127 scope.go:117] "RemoveContainer" containerID="fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.538222 5127 scope.go:117] "RemoveContainer" containerID="79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.582005 5127 scope.go:117] "RemoveContainer" containerID="3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668" Feb 01 09:09:27 crc kubenswrapper[5127]: E0201 09:09:27.582663 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668\": container with ID starting with 3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668 not found: ID does not exist" containerID="3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.582705 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668"} err="failed to get container status \"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668\": rpc error: code = NotFound desc = could not find container \"3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668\": container with ID starting with 3b44ac87f19492ca5d58829261def7fca96cd2a02978f219b0ed472ebf266668 not found: ID does not exist" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.582733 5127 scope.go:117] "RemoveContainer" containerID="fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829" Feb 01 09:09:27 crc kubenswrapper[5127]: E0201 09:09:27.583117 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829\": container with ID starting with fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829 not found: ID does not exist" containerID="fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.583146 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829"} err="failed to get container status \"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829\": rpc error: code = NotFound desc = could not find container \"fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829\": container with ID starting with fd3e13dba8b04f55b0fd56f99b177e5fca257ea5eb8b020a1e8eafcded881829 not found: ID does not exist" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.583167 5127 scope.go:117] "RemoveContainer" containerID="79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab" Feb 01 09:09:27 crc kubenswrapper[5127]: E0201 09:09:27.583551 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab\": container with ID starting with 79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab not found: ID does not exist" containerID="79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab" Feb 01 09:09:27 crc kubenswrapper[5127]: I0201 09:09:27.583596 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab"} err="failed to get container status \"79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab\": rpc error: code = NotFound desc = could not find container \"79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab\": container with ID starting with 79b866d27e294b40166a6687b4c185defa354e5550e58c34790f81a0705993ab not found: ID does not exist" Feb 01 09:09:28 crc kubenswrapper[5127]: I0201 09:09:28.249553 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" path="/var/lib/kubelet/pods/b6b01070-51fe-4f94-beff-8f448f55b8f9/volumes" Feb 01 09:09:28 crc kubenswrapper[5127]: I0201 09:09:28.251181 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" path="/var/lib/kubelet/pods/ed1f1eba-207c-4955-8b0a-2372e7a846d7/volumes" Feb 01 09:09:30 crc kubenswrapper[5127]: I0201 09:09:30.766906 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:30 crc kubenswrapper[5127]: I0201 09:09:30.767534 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8t6mf" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="registry-server" containerID="cri-o://7c7042dd8219cfb65f3ebc02f529c9d6a3e7b06bc128a4373f43bce2074f66d1" gracePeriod=2 Feb 01 09:09:31 crc kubenswrapper[5127]: I0201 09:09:31.586892 5127 generic.go:334] "Generic (PLEG): container finished" podID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerID="7c7042dd8219cfb65f3ebc02f529c9d6a3e7b06bc128a4373f43bce2074f66d1" exitCode=0 Feb 01 09:09:31 crc kubenswrapper[5127]: I0201 09:09:31.586997 5127 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerDied","Data":"7c7042dd8219cfb65f3ebc02f529c9d6a3e7b06bc128a4373f43bce2074f66d1"} Feb 01 09:09:31 crc kubenswrapper[5127]: I0201 09:09:31.956616 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.079714 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities\") pod \"a75aee89-7e67-48c4-ab7e-b9376268685f\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.079891 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content\") pod \"a75aee89-7e67-48c4-ab7e-b9376268685f\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.080140 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zhs2\" (UniqueName: \"kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2\") pod \"a75aee89-7e67-48c4-ab7e-b9376268685f\" (UID: \"a75aee89-7e67-48c4-ab7e-b9376268685f\") " Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.080791 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities" (OuterVolumeSpecName: "utilities") pod "a75aee89-7e67-48c4-ab7e-b9376268685f" (UID: "a75aee89-7e67-48c4-ab7e-b9376268685f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.090078 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2" (OuterVolumeSpecName: "kube-api-access-2zhs2") pod "a75aee89-7e67-48c4-ab7e-b9376268685f" (UID: "a75aee89-7e67-48c4-ab7e-b9376268685f"). InnerVolumeSpecName "kube-api-access-2zhs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.114027 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a75aee89-7e67-48c4-ab7e-b9376268685f" (UID: "a75aee89-7e67-48c4-ab7e-b9376268685f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.184045 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.184086 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a75aee89-7e67-48c4-ab7e-b9376268685f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.184103 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zhs2\" (UniqueName: \"kubernetes.io/projected/a75aee89-7e67-48c4-ab7e-b9376268685f-kube-api-access-2zhs2\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.598923 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t6mf" event={"ID":"a75aee89-7e67-48c4-ab7e-b9376268685f","Type":"ContainerDied","Data":"58ded7d9fc31502704e10e51bfa804c0f7b310191999fe87fea7a416180ce39b"} Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.599019 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t6mf" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.599287 5127 scope.go:117] "RemoveContainer" containerID="7c7042dd8219cfb65f3ebc02f529c9d6a3e7b06bc128a4373f43bce2074f66d1" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.625237 5127 scope.go:117] "RemoveContainer" containerID="4e75fb741a139554b167ebf4dc678480a2f8889cf4e97dbeda62457d91c69aba" Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.632162 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.650310 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t6mf"] Feb 01 09:09:32 crc kubenswrapper[5127]: I0201 09:09:32.658904 5127 scope.go:117] "RemoveContainer" containerID="848422a89d50c6567cbad4fa8d06a3bdee99c88659e359ce809545d63f316f8f" Feb 01 09:09:34 crc kubenswrapper[5127]: I0201 09:09:34.253303 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" path="/var/lib/kubelet/pods/a75aee89-7e67-48c4-ab7e-b9376268685f/volumes" Feb 01 09:09:41 crc kubenswrapper[5127]: I0201 09:09:41.697639 5127 generic.go:334] "Generic (PLEG): container finished" podID="fbcbec86-272a-408e-9f57-5478b28fe0ed" containerID="5c7cf4eaed1bc4f4a7dc9fd90da0237ce0910171dcd67b999ddfbc4de78ee082" exitCode=0 Feb 01 09:09:41 crc kubenswrapper[5127]: I0201 09:09:41.697749 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" event={"ID":"fbcbec86-272a-408e-9f57-5478b28fe0ed","Type":"ContainerDied","Data":"5c7cf4eaed1bc4f4a7dc9fd90da0237ce0910171dcd67b999ddfbc4de78ee082"} Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.226226 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.250360 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") pod \"fbcbec86-272a-408e-9f57-5478b28fe0ed\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.250621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxk5n\" (UniqueName: \"kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n\") pod \"fbcbec86-272a-408e-9f57-5478b28fe0ed\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.250647 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory\") pod \"fbcbec86-272a-408e-9f57-5478b28fe0ed\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.258757 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n" (OuterVolumeSpecName: "kube-api-access-mxk5n") pod "fbcbec86-272a-408e-9f57-5478b28fe0ed" (UID: "fbcbec86-272a-408e-9f57-5478b28fe0ed"). InnerVolumeSpecName "kube-api-access-mxk5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.281980 5127 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker podName:fbcbec86-272a-408e-9f57-5478b28fe0ed nodeName:}" failed. No retries permitted until 2026-02-01 09:09:43.78195448 +0000 UTC m=+8534.267856843 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-networker" (UniqueName: "kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker") pod "fbcbec86-272a-408e-9f57-5478b28fe0ed" (UID: "fbcbec86-272a-408e-9f57-5478b28fe0ed") : error deleting /var/lib/kubelet/pods/fbcbec86-272a-408e-9f57-5478b28fe0ed/volume-subpaths: remove /var/lib/kubelet/pods/fbcbec86-272a-408e-9f57-5478b28fe0ed/volume-subpaths: no such file or directory Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.287599 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory" (OuterVolumeSpecName: "inventory") pod "fbcbec86-272a-408e-9f57-5478b28fe0ed" (UID: "fbcbec86-272a-408e-9f57-5478b28fe0ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.353266 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxk5n\" (UniqueName: \"kubernetes.io/projected/fbcbec86-272a-408e-9f57-5478b28fe0ed-kube-api-access-mxk5n\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.353303 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.722302 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" event={"ID":"fbcbec86-272a-408e-9f57-5478b28fe0ed","Type":"ContainerDied","Data":"1b40a6882080fe816872f51753f4df4d38bea0a92f3cfdd8efa241bba099ad3f"} Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.722384 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b40a6882080fe816872f51753f4df4d38bea0a92f3cfdd8efa241bba099ad3f" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.722412 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-9mfdv" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.852649 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-8n9vd"] Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853184 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853208 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853236 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853245 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853264 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853273 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853295 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcbec86-272a-408e-9f57-5478b28fe0ed" containerName="configure-network-openstack-openstack-networker" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853307 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcbec86-272a-408e-9f57-5478b28fe0ed" containerName="configure-network-openstack-openstack-networker" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853323 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853356 5127 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853382 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853390 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="extract-content" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853410 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853419 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853428 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853437 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853450 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853459 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: E0201 09:09:43.853477 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853487 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="extract-utilities" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853764 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcbec86-272a-408e-9f57-5478b28fe0ed" containerName="configure-network-openstack-openstack-networker" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853786 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed1f1eba-207c-4955-8b0a-2372e7a846d7" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853804 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="a75aee89-7e67-48c4-ab7e-b9376268685f" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.853817 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b01070-51fe-4f94-beff-8f448f55b8f9" containerName="registry-server" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.854853 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.868160 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-8n9vd"] Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.869720 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") pod \"fbcbec86-272a-408e-9f57-5478b28fe0ed\" (UID: \"fbcbec86-272a-408e-9f57-5478b28fe0ed\") " Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.888501 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "fbcbec86-272a-408e-9f57-5478b28fe0ed" (UID: "fbcbec86-272a-408e-9f57-5478b28fe0ed"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.972532 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.972776 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fv4\" (UniqueName: \"kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.972932 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:43 crc kubenswrapper[5127]: I0201 09:09:43.973010 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/fbcbec86-272a-408e-9f57-5478b28fe0ed-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.075163 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fv4\" (UniqueName: \"kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.075266 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker\") pod 
\"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.075372 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.079341 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.079845 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.095486 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fv4\" (UniqueName: \"kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4\") pod \"validate-network-openstack-openstack-networker-8n9vd\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.223096 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:44 crc kubenswrapper[5127]: I0201 09:09:44.867258 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-8n9vd"] Feb 01 09:09:45 crc kubenswrapper[5127]: I0201 09:09:45.741224 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" event={"ID":"e8c396ea-90b7-4ace-962d-2dfa97f0488a","Type":"ContainerStarted","Data":"71c8b9cc993350ebf7e9d312f219b09f741f26ea847d1fc53c720462d8bc0e39"} Feb 01 09:09:45 crc kubenswrapper[5127]: I0201 09:09:45.741757 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" event={"ID":"e8c396ea-90b7-4ace-962d-2dfa97f0488a","Type":"ContainerStarted","Data":"ecc93c80674c9e120312f39574ddfa80326d90c7c47894343d94c0d5f8872921"} Feb 01 09:09:45 crc kubenswrapper[5127]: I0201 09:09:45.763256 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" podStartSLOduration=2.281650005 podStartE2EDuration="2.763229924s" podCreationTimestamp="2026-02-01 09:09:43 +0000 UTC" firstStartedPulling="2026-02-01 09:09:44.869659545 +0000 UTC m=+8535.355561918" lastFinishedPulling="2026-02-01 09:09:45.351239474 +0000 UTC m=+8535.837141837" observedRunningTime="2026-02-01 09:09:45.759297018 +0000 UTC m=+8536.245199401" watchObservedRunningTime="2026-02-01 09:09:45.763229924 +0000 UTC m=+8536.249132327" Feb 01 09:09:51 crc kubenswrapper[5127]: I0201 09:09:51.822396 5127 generic.go:334] "Generic (PLEG): container finished" podID="e8c396ea-90b7-4ace-962d-2dfa97f0488a" containerID="71c8b9cc993350ebf7e9d312f219b09f741f26ea847d1fc53c720462d8bc0e39" exitCode=0 Feb 01 09:09:51 crc kubenswrapper[5127]: I0201 09:09:51.822834 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" event={"ID":"e8c396ea-90b7-4ace-962d-2dfa97f0488a","Type":"ContainerDied","Data":"71c8b9cc993350ebf7e9d312f219b09f741f26ea847d1fc53c720462d8bc0e39"} Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.343638 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.391514 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory\") pod \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.391599 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fv4\" (UniqueName: \"kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4\") pod \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.392668 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker\") pod \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\" (UID: \"e8c396ea-90b7-4ace-962d-2dfa97f0488a\") " Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.399632 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4" (OuterVolumeSpecName: "kube-api-access-t5fv4") pod "e8c396ea-90b7-4ace-962d-2dfa97f0488a" (UID: "e8c396ea-90b7-4ace-962d-2dfa97f0488a"). InnerVolumeSpecName "kube-api-access-t5fv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.437007 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e8c396ea-90b7-4ace-962d-2dfa97f0488a" (UID: "e8c396ea-90b7-4ace-962d-2dfa97f0488a"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.441348 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory" (OuterVolumeSpecName: "inventory") pod "e8c396ea-90b7-4ace-962d-2dfa97f0488a" (UID: "e8c396ea-90b7-4ace-962d-2dfa97f0488a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.494092 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.494117 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fv4\" (UniqueName: \"kubernetes.io/projected/e8c396ea-90b7-4ace-962d-2dfa97f0488a-kube-api-access-t5fv4\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.494129 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e8c396ea-90b7-4ace-962d-2dfa97f0488a-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.849978 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" event={"ID":"e8c396ea-90b7-4ace-962d-2dfa97f0488a","Type":"ContainerDied","Data":"ecc93c80674c9e120312f39574ddfa80326d90c7c47894343d94c0d5f8872921"} Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.850022 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecc93c80674c9e120312f39574ddfa80326d90c7c47894343d94c0d5f8872921" Feb 01 09:09:53 crc kubenswrapper[5127]: I0201 09:09:53.850068 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-8n9vd" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.021364 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-dbs22"] Feb 01 09:09:54 crc kubenswrapper[5127]: E0201 09:09:54.021846 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8c396ea-90b7-4ace-962d-2dfa97f0488a" containerName="validate-network-openstack-openstack-networker" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.021870 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8c396ea-90b7-4ace-962d-2dfa97f0488a" containerName="validate-network-openstack-openstack-networker" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.022088 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8c396ea-90b7-4ace-962d-2dfa97f0488a" containerName="validate-network-openstack-openstack-networker" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.022767 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.025695 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.029061 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.047168 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-dbs22"] Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.105240 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.105321 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vj5q\" (UniqueName: \"kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.105453 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.206263 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.206680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.206813 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vj5q\" (UniqueName: \"kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.212923 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory\") pod 
\"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.217410 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.226136 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vj5q\" (UniqueName: \"kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q\") pod \"install-os-openstack-openstack-networker-dbs22\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.340854 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:09:54 crc kubenswrapper[5127]: I0201 09:09:54.935298 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-dbs22"] Feb 01 09:09:55 crc kubenswrapper[5127]: I0201 09:09:55.876564 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-dbs22" event={"ID":"d9ccb5f5-614a-4b43-972c-20b2b907a88c","Type":"ContainerStarted","Data":"58accce57e7f533b564e737d0c51aa775644fc7f7c700671054d62049f409e4d"} Feb 01 09:09:55 crc kubenswrapper[5127]: I0201 09:09:55.877233 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-dbs22" event={"ID":"d9ccb5f5-614a-4b43-972c-20b2b907a88c","Type":"ContainerStarted","Data":"521fb81ac77c64893d6ba80328115404a95d3dd5276052ede088448ff3e92b6b"} Feb 01 09:09:55 crc kubenswrapper[5127]: I0201 09:09:55.891097 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-dbs22" podStartSLOduration=2.470580283 podStartE2EDuration="2.89108465s" podCreationTimestamp="2026-02-01 09:09:53 +0000 UTC" firstStartedPulling="2026-02-01 09:09:54.936832327 +0000 UTC m=+8545.422734700" lastFinishedPulling="2026-02-01 09:09:55.357336664 +0000 UTC m=+8545.843239067" observedRunningTime="2026-02-01 09:09:55.890013401 +0000 UTC m=+8546.375915764" watchObservedRunningTime="2026-02-01 09:09:55.89108465 +0000 UTC m=+8546.376987003" Feb 01 09:10:02 crc kubenswrapper[5127]: I0201 09:10:02.977045 5127 generic.go:334] "Generic (PLEG): container finished" podID="e605bdd8-d806-44ed-a832-fc7917f53089" containerID="8e613c8a170c4c15bea74fd9e1d5a46e78fca1d577b2cdf1351f07b886cad987" exitCode=0 Feb 01 09:10:02 crc kubenswrapper[5127]: I0201 09:10:02.977095 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-bm985" event={"ID":"e605bdd8-d806-44ed-a832-fc7917f53089","Type":"ContainerDied","Data":"8e613c8a170c4c15bea74fd9e1d5a46e78fca1d577b2cdf1351f07b886cad987"} Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.510292 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.670350 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm2ml\" (UniqueName: \"kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml\") pod \"e605bdd8-d806-44ed-a832-fc7917f53089\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.670393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1\") pod \"e605bdd8-d806-44ed-a832-fc7917f53089\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.670499 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph\") pod \"e605bdd8-d806-44ed-a832-fc7917f53089\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.670797 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory\") pod \"e605bdd8-d806-44ed-a832-fc7917f53089\" (UID: \"e605bdd8-d806-44ed-a832-fc7917f53089\") " Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.676676 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml" (OuterVolumeSpecName: "kube-api-access-qm2ml") pod "e605bdd8-d806-44ed-a832-fc7917f53089" (UID: "e605bdd8-d806-44ed-a832-fc7917f53089"). InnerVolumeSpecName "kube-api-access-qm2ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.676803 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph" (OuterVolumeSpecName: "ceph") pod "e605bdd8-d806-44ed-a832-fc7917f53089" (UID: "e605bdd8-d806-44ed-a832-fc7917f53089"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.697845 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory" (OuterVolumeSpecName: "inventory") pod "e605bdd8-d806-44ed-a832-fc7917f53089" (UID: "e605bdd8-d806-44ed-a832-fc7917f53089"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.702751 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e605bdd8-d806-44ed-a832-fc7917f53089" (UID: "e605bdd8-d806-44ed-a832-fc7917f53089"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.773091 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.773121 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.773134 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm2ml\" (UniqueName: \"kubernetes.io/projected/e605bdd8-d806-44ed-a832-fc7917f53089-kube-api-access-qm2ml\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:04 crc kubenswrapper[5127]: I0201 09:10:04.773144 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e605bdd8-d806-44ed-a832-fc7917f53089-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.003422 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-bm985" event={"ID":"e605bdd8-d806-44ed-a832-fc7917f53089","Type":"ContainerDied","Data":"95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e"} Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.003461 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95d359a05a929b2f632f4b61829ec11b624d00bf499a90fb75ac6323eb497e9e" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.003892 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-bm985" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.094228 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-vhpqj"] Feb 01 09:10:05 crc kubenswrapper[5127]: E0201 09:10:05.094862 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e605bdd8-d806-44ed-a832-fc7917f53089" containerName="configure-network-openstack-openstack-cell1" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.094893 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e605bdd8-d806-44ed-a832-fc7917f53089" containerName="configure-network-openstack-openstack-cell1" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.095195 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e605bdd8-d806-44ed-a832-fc7917f53089" containerName="configure-network-openstack-openstack-cell1" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.096190 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.098718 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.100051 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.114519 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-vhpqj"] Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.182573 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.182845 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.182895 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.182921 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhg5l\" (UniqueName: \"kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.284949 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.285014 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.285059 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhg5l\" (UniqueName: \"kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l\") pod 
\"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.285096 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.289442 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.291331 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.292438 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.301034 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhg5l\" (UniqueName: \"kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l\") pod \"validate-network-openstack-openstack-cell1-vhpqj\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:05 crc kubenswrapper[5127]: I0201 09:10:05.426370 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:06 crc kubenswrapper[5127]: I0201 09:10:06.070213 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-vhpqj"] Feb 01 09:10:07 crc kubenswrapper[5127]: I0201 09:10:07.042639 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" event={"ID":"7b6d0276-99a8-40e9-8c7a-53633dc1c58f","Type":"ContainerStarted","Data":"49537168ee07b0ac39f6bc79e67483f00060333d4ca7fa70ff61636a0cc3067b"} Feb 01 09:10:07 crc kubenswrapper[5127]: I0201 09:10:07.043057 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" event={"ID":"7b6d0276-99a8-40e9-8c7a-53633dc1c58f","Type":"ContainerStarted","Data":"94dc366965361ebd2f9ac8c94cf49e44fef716625b040776d3e8dcf43aee20ae"} Feb 01 09:10:07 crc kubenswrapper[5127]: I0201 09:10:07.073471 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" podStartSLOduration=1.682250018 podStartE2EDuration="2.073450541s" podCreationTimestamp="2026-02-01 09:10:05 +0000 UTC" firstStartedPulling="2026-02-01 09:10:06.075936361 +0000 UTC m=+8556.561838734" lastFinishedPulling="2026-02-01 09:10:06.467136854 +0000 UTC m=+8556.953039257" observedRunningTime="2026-02-01 09:10:07.063137075 +0000 UTC m=+8557.549039438" watchObservedRunningTime="2026-02-01 09:10:07.073450541 +0000 UTC m=+8557.559352914" Feb 01 09:10:12 crc kubenswrapper[5127]: I0201 09:10:12.101463 5127 generic.go:334] "Generic (PLEG): container finished" podID="7b6d0276-99a8-40e9-8c7a-53633dc1c58f" containerID="49537168ee07b0ac39f6bc79e67483f00060333d4ca7fa70ff61636a0cc3067b" exitCode=0 Feb 01 09:10:12 crc kubenswrapper[5127]: I0201 09:10:12.101535 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" event={"ID":"7b6d0276-99a8-40e9-8c7a-53633dc1c58f","Type":"ContainerDied","Data":"49537168ee07b0ac39f6bc79e67483f00060333d4ca7fa70ff61636a0cc3067b"} Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.625311 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.793813 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph\") pod \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.794056 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhg5l\" (UniqueName: \"kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l\") pod \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.794881 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory\") pod \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.794937 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1\") pod \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\" (UID: \"7b6d0276-99a8-40e9-8c7a-53633dc1c58f\") " Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.800946 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l" (OuterVolumeSpecName: "kube-api-access-qhg5l") pod "7b6d0276-99a8-40e9-8c7a-53633dc1c58f" (UID: "7b6d0276-99a8-40e9-8c7a-53633dc1c58f"). InnerVolumeSpecName "kube-api-access-qhg5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.805714 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph" (OuterVolumeSpecName: "ceph") pod "7b6d0276-99a8-40e9-8c7a-53633dc1c58f" (UID: "7b6d0276-99a8-40e9-8c7a-53633dc1c58f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.840517 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory" (OuterVolumeSpecName: "inventory") pod "7b6d0276-99a8-40e9-8c7a-53633dc1c58f" (UID: "7b6d0276-99a8-40e9-8c7a-53633dc1c58f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.840544 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7b6d0276-99a8-40e9-8c7a-53633dc1c58f" (UID: "7b6d0276-99a8-40e9-8c7a-53633dc1c58f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.897467 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.897525 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.897537 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhg5l\" (UniqueName: \"kubernetes.io/projected/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-kube-api-access-qhg5l\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:13 crc kubenswrapper[5127]: I0201 09:10:13.897546 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b6d0276-99a8-40e9-8c7a-53633dc1c58f-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.128664 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" event={"ID":"7b6d0276-99a8-40e9-8c7a-53633dc1c58f","Type":"ContainerDied","Data":"94dc366965361ebd2f9ac8c94cf49e44fef716625b040776d3e8dcf43aee20ae"} Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.129100 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94dc366965361ebd2f9ac8c94cf49e44fef716625b040776d3e8dcf43aee20ae" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.128711 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-vhpqj" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.273495 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-d6vxn"] Feb 01 09:10:14 crc kubenswrapper[5127]: E0201 09:10:14.285064 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6d0276-99a8-40e9-8c7a-53633dc1c58f" containerName="validate-network-openstack-openstack-cell1" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.285170 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6d0276-99a8-40e9-8c7a-53633dc1c58f" containerName="validate-network-openstack-openstack-cell1" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.286118 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6d0276-99a8-40e9-8c7a-53633dc1c58f" containerName="validate-network-openstack-openstack-cell1" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.287631 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-d6vxn"] Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.287770 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.295797 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.296090 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.410363 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.410430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.410532 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26q2d\" (UniqueName: \"kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.410555 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.533820 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.533914 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.533990 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26q2d\" (UniqueName: \"kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.534030 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.539089 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.543081 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.545070 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.556429 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26q2d\" (UniqueName: \"kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d\") pod \"install-os-openstack-openstack-cell1-d6vxn\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:14 crc kubenswrapper[5127]: I0201 09:10:14.608099 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:10:15 crc kubenswrapper[5127]: I0201 09:10:15.238315 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-d6vxn"] Feb 01 09:10:16 crc kubenswrapper[5127]: I0201 09:10:16.151560 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" event={"ID":"11fb127c-46a4-421b-aefe-78b11051f499","Type":"ContainerStarted","Data":"fd0fa11b9095942911db46c77ebf4bf367060f2a32d3ea74f9fb7b288a12d317"} Feb 01 09:10:16 crc kubenswrapper[5127]: I0201 09:10:16.152185 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" event={"ID":"11fb127c-46a4-421b-aefe-78b11051f499","Type":"ContainerStarted","Data":"892942d2e614cd8aafc3c451a4c6be8bb00b72abd32eef441a2fd2c33d5f37f4"} Feb 01 09:10:16 crc kubenswrapper[5127]: I0201 09:10:16.187858 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" podStartSLOduration=1.7442302870000002 podStartE2EDuration="2.187826691s" podCreationTimestamp="2026-02-01 09:10:14 +0000 UTC" firstStartedPulling="2026-02-01 09:10:15.240537535 +0000 UTC m=+8565.726439938" lastFinishedPulling="2026-02-01 09:10:15.684133949 +0000 UTC m=+8566.170036342" observedRunningTime="2026-02-01 09:10:16.171240847 +0000 UTC m=+8566.657143280" watchObservedRunningTime="2026-02-01 09:10:16.187826691 +0000 UTC m=+8566.673729094" Feb 01 09:10:47 crc kubenswrapper[5127]: I0201 09:10:47.496316 5127 generic.go:334] "Generic (PLEG): container finished" podID="d9ccb5f5-614a-4b43-972c-20b2b907a88c" containerID="58accce57e7f533b564e737d0c51aa775644fc7f7c700671054d62049f409e4d" exitCode=0 Feb 01 09:10:47 crc kubenswrapper[5127]: I0201 09:10:47.496471 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-dbs22" event={"ID":"d9ccb5f5-614a-4b43-972c-20b2b907a88c","Type":"ContainerDied","Data":"58accce57e7f533b564e737d0c51aa775644fc7f7c700671054d62049f409e4d"} Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.091877 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.177216 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory\") pod \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.177529 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker\") pod \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.177685 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vj5q\" (UniqueName: \"kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q\") pod \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\" (UID: \"d9ccb5f5-614a-4b43-972c-20b2b907a88c\") " Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.182370 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q" (OuterVolumeSpecName: "kube-api-access-8vj5q") pod "d9ccb5f5-614a-4b43-972c-20b2b907a88c" (UID: "d9ccb5f5-614a-4b43-972c-20b2b907a88c"). InnerVolumeSpecName "kube-api-access-8vj5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.205793 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory" (OuterVolumeSpecName: "inventory") pod "d9ccb5f5-614a-4b43-972c-20b2b907a88c" (UID: "d9ccb5f5-614a-4b43-972c-20b2b907a88c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.234815 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "d9ccb5f5-614a-4b43-972c-20b2b907a88c" (UID: "d9ccb5f5-614a-4b43-972c-20b2b907a88c"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.280378 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vj5q\" (UniqueName: \"kubernetes.io/projected/d9ccb5f5-614a-4b43-972c-20b2b907a88c-kube-api-access-8vj5q\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.280439 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.280451 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/d9ccb5f5-614a-4b43-972c-20b2b907a88c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.529374 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-dbs22" event={"ID":"d9ccb5f5-614a-4b43-972c-20b2b907a88c","Type":"ContainerDied","Data":"521fb81ac77c64893d6ba80328115404a95d3dd5276052ede088448ff3e92b6b"} Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.529434 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="521fb81ac77c64893d6ba80328115404a95d3dd5276052ede088448ff3e92b6b" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.529500 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-dbs22" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.671676 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-8bccd"] Feb 01 09:10:49 crc kubenswrapper[5127]: E0201 09:10:49.672236 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ccb5f5-614a-4b43-972c-20b2b907a88c" containerName="install-os-openstack-openstack-networker" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.672258 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ccb5f5-614a-4b43-972c-20b2b907a88c" containerName="install-os-openstack-openstack-networker" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.672547 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ccb5f5-614a-4b43-972c-20b2b907a88c" containerName="install-os-openstack-openstack-networker" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.673489 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.676958 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.680134 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-8bccd"] Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.676958 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.803551 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.803662 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn8w2\" (UniqueName: \"kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.803718 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.908973 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.909123 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn8w2\" (UniqueName: \"kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.909178 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.913491 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory\") pod 
\"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.920707 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.940310 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn8w2\" (UniqueName: \"kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2\") pod \"configure-os-openstack-openstack-networker-8bccd\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:49 crc kubenswrapper[5127]: I0201 09:10:49.997764 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:10:50 crc kubenswrapper[5127]: W0201 09:10:50.603768 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf12e809_8735_4243_9027_a9aefe524c55.slice/crio-0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c WatchSource:0}: Error finding container 0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c: Status 404 returned error can't find the container with id 0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c Feb 01 09:10:50 crc kubenswrapper[5127]: I0201 09:10:50.609368 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-8bccd"] Feb 01 09:10:51 crc kubenswrapper[5127]: I0201 09:10:51.552633 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-8bccd" event={"ID":"df12e809-8735-4243-9027-a9aefe524c55","Type":"ContainerStarted","Data":"29f6b4b6c4ec27e74b5bfc44ade723536516465d3eff3dbd01169d438f57b60a"} Feb 01 09:10:51 crc kubenswrapper[5127]: I0201 09:10:51.553191 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-8bccd" event={"ID":"df12e809-8735-4243-9027-a9aefe524c55","Type":"ContainerStarted","Data":"0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c"} Feb 01 09:10:51 crc kubenswrapper[5127]: I0201 09:10:51.574797 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-8bccd" podStartSLOduration=2.120489731 podStartE2EDuration="2.574779901s" podCreationTimestamp="2026-02-01 09:10:49 +0000 UTC" firstStartedPulling="2026-02-01 09:10:50.607279405 +0000 UTC m=+8601.093181808" lastFinishedPulling="2026-02-01 09:10:51.061569575 +0000 UTC m=+8601.547471978" observedRunningTime="2026-02-01 09:10:51.567616479 +0000 UTC m=+8602.053518842" watchObservedRunningTime="2026-02-01 09:10:51.574779901 +0000 UTC m=+8602.060682254" Feb 01 09:11:04 crc kubenswrapper[5127]: I0201 09:11:04.734144 5127 generic.go:334] "Generic (PLEG): container finished" podID="11fb127c-46a4-421b-aefe-78b11051f499" containerID="fd0fa11b9095942911db46c77ebf4bf367060f2a32d3ea74f9fb7b288a12d317" exitCode=0 Feb 01 
09:11:04 crc kubenswrapper[5127]: I0201 09:11:04.734812 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" event={"ID":"11fb127c-46a4-421b-aefe-78b11051f499","Type":"ContainerDied","Data":"fd0fa11b9095942911db46c77ebf4bf367060f2a32d3ea74f9fb7b288a12d317"} Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.372170 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.505618 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26q2d\" (UniqueName: \"kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d\") pod \"11fb127c-46a4-421b-aefe-78b11051f499\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.506018 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory\") pod \"11fb127c-46a4-421b-aefe-78b11051f499\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.506264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1\") pod \"11fb127c-46a4-421b-aefe-78b11051f499\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.506346 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph\") pod \"11fb127c-46a4-421b-aefe-78b11051f499\" (UID: \"11fb127c-46a4-421b-aefe-78b11051f499\") " Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.511979 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph" (OuterVolumeSpecName: "ceph") pod "11fb127c-46a4-421b-aefe-78b11051f499" (UID: "11fb127c-46a4-421b-aefe-78b11051f499"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.513498 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d" (OuterVolumeSpecName: "kube-api-access-26q2d") pod "11fb127c-46a4-421b-aefe-78b11051f499" (UID: "11fb127c-46a4-421b-aefe-78b11051f499"). InnerVolumeSpecName "kube-api-access-26q2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.542821 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory" (OuterVolumeSpecName: "inventory") pod "11fb127c-46a4-421b-aefe-78b11051f499" (UID: "11fb127c-46a4-421b-aefe-78b11051f499"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.556011 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "11fb127c-46a4-421b-aefe-78b11051f499" (UID: "11fb127c-46a4-421b-aefe-78b11051f499"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.609520 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.609560 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.609575 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26q2d\" (UniqueName: \"kubernetes.io/projected/11fb127c-46a4-421b-aefe-78b11051f499-kube-api-access-26q2d\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.609605 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11fb127c-46a4-421b-aefe-78b11051f499-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.741309 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.741368 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.759297 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" event={"ID":"11fb127c-46a4-421b-aefe-78b11051f499","Type":"ContainerDied","Data":"892942d2e614cd8aafc3c451a4c6be8bb00b72abd32eef441a2fd2c33d5f37f4"} Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.759342 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="892942d2e614cd8aafc3c451a4c6be8bb00b72abd32eef441a2fd2c33d5f37f4" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.759355 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-d6vxn" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.855785 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hfch5"] Feb 01 09:11:06 crc kubenswrapper[5127]: E0201 09:11:06.857326 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb127c-46a4-421b-aefe-78b11051f499" containerName="install-os-openstack-openstack-cell1" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.857529 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb127c-46a4-421b-aefe-78b11051f499" containerName="install-os-openstack-openstack-cell1" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.858166 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fb127c-46a4-421b-aefe-78b11051f499" containerName="install-os-openstack-openstack-cell1" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.859787 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.863512 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.863674 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.894069 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hfch5"] Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.915772 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8kl\" (UniqueName: \"kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.915956 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.916122 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:06 crc kubenswrapper[5127]: I0201 09:11:06.916425 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.018914 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.019021 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8kl\" (UniqueName: \"kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.019236 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.019394 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.023459 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.024729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.031626 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.039506 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8kl\" (UniqueName: \"kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl\") pod \"configure-os-openstack-openstack-cell1-hfch5\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.203958 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:11:07 crc kubenswrapper[5127]: I0201 09:11:07.851233 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hfch5"] Feb 01 09:11:08 crc kubenswrapper[5127]: I0201 09:11:08.786867 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" event={"ID":"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e","Type":"ContainerStarted","Data":"a3a05301561dd4ab015efddbccf3d2ee9a4650531e96596915cf9405e3637e75"} Feb 01 09:11:08 crc kubenswrapper[5127]: I0201 09:11:08.787704 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" event={"ID":"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e","Type":"ContainerStarted","Data":"d20019750d45b139a649428fb4bd79768397b380305b4638966daec35fd22485"} Feb 01 09:11:08 crc kubenswrapper[5127]: I0201 09:11:08.818706 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" podStartSLOduration=2.263405511 podStartE2EDuration="2.818686872s" podCreationTimestamp="2026-02-01 09:11:06 +0000 UTC" firstStartedPulling="2026-02-01 09:11:07.858560482 +0000 UTC m=+8618.344462845" lastFinishedPulling="2026-02-01 09:11:08.413841803 +0000 UTC m=+8618.899744206" observedRunningTime="2026-02-01 09:11:08.809340741 +0000 UTC m=+8619.295243114" watchObservedRunningTime="2026-02-01 09:11:08.818686872 +0000 UTC m=+8619.304589235" Feb 01 09:11:36 crc kubenswrapper[5127]: I0201 09:11:36.741555 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:11:36 crc kubenswrapper[5127]: I0201 09:11:36.742490 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:11:45 crc kubenswrapper[5127]: I0201 09:11:45.240730 5127 generic.go:334] "Generic (PLEG): container finished" podID="df12e809-8735-4243-9027-a9aefe524c55" containerID="29f6b4b6c4ec27e74b5bfc44ade723536516465d3eff3dbd01169d438f57b60a" exitCode=0 Feb 01 09:11:45 crc kubenswrapper[5127]: I0201 09:11:45.240839 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-8bccd" event={"ID":"df12e809-8735-4243-9027-a9aefe524c55","Type":"ContainerDied","Data":"29f6b4b6c4ec27e74b5bfc44ade723536516465d3eff3dbd01169d438f57b60a"} Feb 01 09:11:46 crc kubenswrapper[5127]: I0201 09:11:46.820445 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.004139 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn8w2\" (UniqueName: \"kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2\") pod \"df12e809-8735-4243-9027-a9aefe524c55\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.004321 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker\") pod \"df12e809-8735-4243-9027-a9aefe524c55\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.004353 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory\") pod \"df12e809-8735-4243-9027-a9aefe524c55\" (UID: \"df12e809-8735-4243-9027-a9aefe524c55\") " Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.009745 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2" (OuterVolumeSpecName: "kube-api-access-kn8w2") pod "df12e809-8735-4243-9027-a9aefe524c55" (UID: "df12e809-8735-4243-9027-a9aefe524c55"). InnerVolumeSpecName "kube-api-access-kn8w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.038355 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory" (OuterVolumeSpecName: "inventory") pod "df12e809-8735-4243-9027-a9aefe524c55" (UID: "df12e809-8735-4243-9027-a9aefe524c55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.042769 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "df12e809-8735-4243-9027-a9aefe524c55" (UID: "df12e809-8735-4243-9027-a9aefe524c55"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.106858 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.106885 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn8w2\" (UniqueName: \"kubernetes.io/projected/df12e809-8735-4243-9027-a9aefe524c55-kube-api-access-kn8w2\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.106898 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/df12e809-8735-4243-9027-a9aefe524c55-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.274784 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-8bccd" event={"ID":"df12e809-8735-4243-9027-a9aefe524c55","Type":"ContainerDied","Data":"0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c"} Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.274823 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bd995e709e6912d524191cc3ef362044ec82ccb76990d99670cf0cebbe5780c" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.274924 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-8bccd" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.395121 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-tfsrs"] Feb 01 09:11:47 crc kubenswrapper[5127]: E0201 09:11:47.395794 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df12e809-8735-4243-9027-a9aefe524c55" containerName="configure-os-openstack-openstack-networker" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.395826 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="df12e809-8735-4243-9027-a9aefe524c55" containerName="configure-os-openstack-openstack-networker" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.396225 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="df12e809-8735-4243-9027-a9aefe524c55" containerName="configure-os-openstack-openstack-networker" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.397444 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.401796 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.402094 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.414300 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-tfsrs"] Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.518726 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.519311 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfjw\" (UniqueName: \"kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.519364 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.621669 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.621783 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfjw\" (UniqueName: \"kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.621829 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.627621 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-tfsrs\" 
(UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.627729 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.650051 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfjw\" (UniqueName: \"kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw\") pod \"run-os-openstack-openstack-networker-tfsrs\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:47 crc kubenswrapper[5127]: I0201 09:11:47.727056 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:48 crc kubenswrapper[5127]: I0201 09:11:48.272939 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-tfsrs"] Feb 01 09:11:48 crc kubenswrapper[5127]: I0201 09:11:48.295753 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-tfsrs" event={"ID":"2f220fdd-7457-4ef7-8314-727210a50eda","Type":"ContainerStarted","Data":"3faa0b4d1b5543ce465bc451b7725ac0943ac252dce363c45027c6990c9d8a90"} Feb 01 09:11:49 crc kubenswrapper[5127]: I0201 09:11:49.311724 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-tfsrs" event={"ID":"2f220fdd-7457-4ef7-8314-727210a50eda","Type":"ContainerStarted","Data":"c3aa21ad0ac73f4cc969787c9023d7c99794d9a2621280da4c499134b119e563"} Feb 01 09:11:49 crc kubenswrapper[5127]: I0201 09:11:49.338668 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-tfsrs" podStartSLOduration=1.659889371 podStartE2EDuration="2.338649555s" podCreationTimestamp="2026-02-01 09:11:47 +0000 UTC" firstStartedPulling="2026-02-01 09:11:48.288416355 +0000 UTC m=+8658.774318728" lastFinishedPulling="2026-02-01 09:11:48.967176549 +0000 UTC m=+8659.453078912" observedRunningTime="2026-02-01 09:11:49.326099568 +0000 UTC m=+8659.812001941" watchObservedRunningTime="2026-02-01 09:11:49.338649555 +0000 UTC m=+8659.824551928" Feb 01 09:11:58 crc kubenswrapper[5127]: I0201 09:11:58.401527 5127 generic.go:334] "Generic (PLEG): container finished" podID="2f220fdd-7457-4ef7-8314-727210a50eda" containerID="c3aa21ad0ac73f4cc969787c9023d7c99794d9a2621280da4c499134b119e563" exitCode=0 Feb 01 09:11:58 crc kubenswrapper[5127]: I0201 09:11:58.401674 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-tfsrs" event={"ID":"2f220fdd-7457-4ef7-8314-727210a50eda","Type":"ContainerDied","Data":"c3aa21ad0ac73f4cc969787c9023d7c99794d9a2621280da4c499134b119e563"} Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.918358 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.942239 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvfjw\" (UniqueName: \"kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw\") pod \"2f220fdd-7457-4ef7-8314-727210a50eda\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.942372 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory\") pod \"2f220fdd-7457-4ef7-8314-727210a50eda\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.942468 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker\") pod \"2f220fdd-7457-4ef7-8314-727210a50eda\" (UID: \"2f220fdd-7457-4ef7-8314-727210a50eda\") " Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.948392 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw" (OuterVolumeSpecName: "kube-api-access-nvfjw") pod "2f220fdd-7457-4ef7-8314-727210a50eda" (UID: "2f220fdd-7457-4ef7-8314-727210a50eda"). InnerVolumeSpecName "kube-api-access-nvfjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.981341 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "2f220fdd-7457-4ef7-8314-727210a50eda" (UID: "2f220fdd-7457-4ef7-8314-727210a50eda"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:11:59 crc kubenswrapper[5127]: I0201 09:11:59.999815 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory" (OuterVolumeSpecName: "inventory") pod "2f220fdd-7457-4ef7-8314-727210a50eda" (UID: "2f220fdd-7457-4ef7-8314-727210a50eda"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.045740 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvfjw\" (UniqueName: \"kubernetes.io/projected/2f220fdd-7457-4ef7-8314-727210a50eda-kube-api-access-nvfjw\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.045781 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.045795 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/2f220fdd-7457-4ef7-8314-727210a50eda-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.429466 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-tfsrs" event={"ID":"2f220fdd-7457-4ef7-8314-727210a50eda","Type":"ContainerDied","Data":"3faa0b4d1b5543ce465bc451b7725ac0943ac252dce363c45027c6990c9d8a90"} Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.429785 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3faa0b4d1b5543ce465bc451b7725ac0943ac252dce363c45027c6990c9d8a90" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.429575 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-tfsrs" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.519833 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-58kk7"] Feb 01 09:12:00 crc kubenswrapper[5127]: E0201 09:12:00.520358 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f220fdd-7457-4ef7-8314-727210a50eda" containerName="run-os-openstack-openstack-networker" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.520385 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f220fdd-7457-4ef7-8314-727210a50eda" containerName="run-os-openstack-openstack-networker" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.520712 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f220fdd-7457-4ef7-8314-727210a50eda" containerName="run-os-openstack-openstack-networker" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.521622 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.523462 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.524264 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.536838 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-58kk7"] Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.556870 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.557008 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.557225 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5npf\" (UniqueName: \"kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.662556 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.662641 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.662818 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5npf\" (UniqueName: \"kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.670169 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker\") pod 
\"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.677543 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.688038 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5npf\" (UniqueName: \"kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf\") pod \"reboot-os-openstack-openstack-networker-58kk7\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:00 crc kubenswrapper[5127]: I0201 09:12:00.854334 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:01 crc kubenswrapper[5127]: I0201 09:12:01.443127 5127 generic.go:334] "Generic (PLEG): container finished" podID="cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" containerID="a3a05301561dd4ab015efddbccf3d2ee9a4650531e96596915cf9405e3637e75" exitCode=0 Feb 01 09:12:01 crc kubenswrapper[5127]: I0201 09:12:01.443178 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" event={"ID":"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e","Type":"ContainerDied","Data":"a3a05301561dd4ab015efddbccf3d2ee9a4650531e96596915cf9405e3637e75"} Feb 01 09:12:01 crc kubenswrapper[5127]: I0201 09:12:01.448326 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-58kk7"] Feb 01 09:12:02 crc kubenswrapper[5127]: I0201 09:12:02.466027 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" event={"ID":"48821e45-5c97-4162-8c43-259f2d4b6a7c","Type":"ContainerStarted","Data":"cd7227f9b690a1851f64f35cec845b2fee7fcab734888886d502e596d4ed3454"} Feb 01 09:12:02 crc kubenswrapper[5127]: I0201 09:12:02.466719 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" event={"ID":"48821e45-5c97-4162-8c43-259f2d4b6a7c","Type":"ContainerStarted","Data":"403494cd3687beab215bcc9bf0e1016de6f228e0cee46f572653ec1a031dd3e9"} Feb 01 09:12:02 crc kubenswrapper[5127]: I0201 09:12:02.497707 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" podStartSLOduration=1.930028479 podStartE2EDuration="2.497674062s" podCreationTimestamp="2026-02-01 09:12:00 +0000 UTC" firstStartedPulling="2026-02-01 09:12:01.466437471 +0000 UTC m=+8671.952339864" lastFinishedPulling="2026-02-01 09:12:02.034083054 +0000 UTC m=+8672.519985447" observedRunningTime="2026-02-01 09:12:02.489169585 +0000 UTC m=+8672.975071978" watchObservedRunningTime="2026-02-01 09:12:02.497674062 +0000 UTC m=+8672.983576465" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.077048 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.137107 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph\") pod \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.137156 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1\") pod \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.137401 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory\") pod \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.137435 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc8kl\" (UniqueName: \"kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl\") pod \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\" (UID: \"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e\") " Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.144784 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph" (OuterVolumeSpecName: "ceph") pod "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" (UID: "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.149961 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl" (OuterVolumeSpecName: "kube-api-access-hc8kl") pod "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" (UID: "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e"). InnerVolumeSpecName "kube-api-access-hc8kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.170958 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" (UID: "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.171302 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory" (OuterVolumeSpecName: "inventory") pod "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" (UID: "cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.240709 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.240734 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.240746 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.240754 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc8kl\" (UniqueName: \"kubernetes.io/projected/cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e-kube-api-access-hc8kl\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.479669 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.479745 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hfch5" event={"ID":"cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e","Type":"ContainerDied","Data":"d20019750d45b139a649428fb4bd79768397b380305b4638966daec35fd22485"} Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.480775 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d20019750d45b139a649428fb4bd79768397b380305b4638966daec35fd22485" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.553349 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-fjs75"] Feb 01 09:12:03 crc kubenswrapper[5127]: E0201 09:12:03.553870 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" containerName="configure-os-openstack-openstack-cell1" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.553888 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" containerName="configure-os-openstack-openstack-cell1" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.554073 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e" containerName="configure-os-openstack-openstack-cell1" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.554847 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.556918 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.557332 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.584154 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-fjs75"] Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648273 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648729 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648809 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78kk9\" (UniqueName: \"kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648841 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648946 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.648977 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.750053 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 
09:12:03.750227 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.750263 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78kk9\" (UniqueName: \"kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.750295 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.750341 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.750371 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.753816 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.754246 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.754284 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.759027 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.767525 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.778374 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78kk9\" (UniqueName: \"kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9\") pod \"ssh-known-hosts-openstack-fjs75\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:03 crc kubenswrapper[5127]: I0201 09:12:03.890084 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:04 crc kubenswrapper[5127]: I0201 09:12:04.475397 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-fjs75"] Feb 01 09:12:04 crc kubenswrapper[5127]: I0201 09:12:04.497770 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fjs75" event={"ID":"7cf892b5-5638-474a-b426-aa1d7b4952af","Type":"ContainerStarted","Data":"428d019b4a2e7ef16240004360691e03a5d51387f52a1d03a29d538cc0fea910"} Feb 01 09:12:05 crc kubenswrapper[5127]: I0201 09:12:05.516410 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fjs75" event={"ID":"7cf892b5-5638-474a-b426-aa1d7b4952af","Type":"ContainerStarted","Data":"80c9d2bd5f44ac48f1bf4e55c522c5eb797e25b35b21a09fe2a59ff75affee46"} Feb 01 09:12:06 crc kubenswrapper[5127]: I0201 09:12:06.741278 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:12:06 crc kubenswrapper[5127]: I0201 09:12:06.741784 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:12:06 crc kubenswrapper[5127]: I0201 09:12:06.741830 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:12:06 crc kubenswrapper[5127]: I0201 09:12:06.742332 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:12:06 crc kubenswrapper[5127]: I0201 09:12:06.742387 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951" gracePeriod=600 Feb 01 09:12:07 crc kubenswrapper[5127]: I0201 09:12:07.556277 5127 generic.go:334] 
"Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951" exitCode=0 Feb 01 09:12:07 crc kubenswrapper[5127]: I0201 09:12:07.556407 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951"} Feb 01 09:12:07 crc kubenswrapper[5127]: I0201 09:12:07.556655 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"} Feb 01 09:12:07 crc kubenswrapper[5127]: I0201 09:12:07.556683 5127 scope.go:117] "RemoveContainer" containerID="dcbe462255c9283899bad30c6484378963303e49f027bafb1e214fa1a0488929" Feb 01 09:12:07 crc kubenswrapper[5127]: I0201 09:12:07.587153 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-fjs75" podStartSLOduration=4.053267224 podStartE2EDuration="4.587127633s" podCreationTimestamp="2026-02-01 09:12:03 +0000 UTC" firstStartedPulling="2026-02-01 09:12:04.478307306 +0000 UTC m=+8674.964209689" lastFinishedPulling="2026-02-01 09:12:05.012167735 +0000 UTC m=+8675.498070098" observedRunningTime="2026-02-01 09:12:05.537437523 +0000 UTC m=+8676.023339886" watchObservedRunningTime="2026-02-01 09:12:07.587127633 +0000 UTC m=+8678.073029996" Feb 01 09:12:17 crc kubenswrapper[5127]: I0201 09:12:17.665406 5127 generic.go:334] "Generic (PLEG): container finished" podID="48821e45-5c97-4162-8c43-259f2d4b6a7c" containerID="cd7227f9b690a1851f64f35cec845b2fee7fcab734888886d502e596d4ed3454" exitCode=0 Feb 01 09:12:17 crc kubenswrapper[5127]: I0201 09:12:17.665516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" event={"ID":"48821e45-5c97-4162-8c43-259f2d4b6a7c","Type":"ContainerDied","Data":"cd7227f9b690a1851f64f35cec845b2fee7fcab734888886d502e596d4ed3454"} Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.233987 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.336725 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5npf\" (UniqueName: \"kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf\") pod \"48821e45-5c97-4162-8c43-259f2d4b6a7c\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.336821 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker\") pod \"48821e45-5c97-4162-8c43-259f2d4b6a7c\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.336994 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory\") pod \"48821e45-5c97-4162-8c43-259f2d4b6a7c\" (UID: \"48821e45-5c97-4162-8c43-259f2d4b6a7c\") " Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.343299 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf" (OuterVolumeSpecName: "kube-api-access-q5npf") pod "48821e45-5c97-4162-8c43-259f2d4b6a7c" (UID: "48821e45-5c97-4162-8c43-259f2d4b6a7c"). InnerVolumeSpecName "kube-api-access-q5npf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.364389 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "48821e45-5c97-4162-8c43-259f2d4b6a7c" (UID: "48821e45-5c97-4162-8c43-259f2d4b6a7c"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.391852 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory" (OuterVolumeSpecName: "inventory") pod "48821e45-5c97-4162-8c43-259f2d4b6a7c" (UID: "48821e45-5c97-4162-8c43-259f2d4b6a7c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.441385 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.441445 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5npf\" (UniqueName: \"kubernetes.io/projected/48821e45-5c97-4162-8c43-259f2d4b6a7c-kube-api-access-q5npf\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.441467 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/48821e45-5c97-4162-8c43-259f2d4b6a7c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.689356 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" event={"ID":"48821e45-5c97-4162-8c43-259f2d4b6a7c","Type":"ContainerDied","Data":"403494cd3687beab215bcc9bf0e1016de6f228e0cee46f572653ec1a031dd3e9"} Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.689866 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="403494cd3687beab215bcc9bf0e1016de6f228e0cee46f572653ec1a031dd3e9" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.689873 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-58kk7" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.828502 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-52rqc"] Feb 01 09:12:19 crc kubenswrapper[5127]: E0201 09:12:19.829324 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48821e45-5c97-4162-8c43-259f2d4b6a7c" containerName="reboot-os-openstack-openstack-networker" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.829446 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="48821e45-5c97-4162-8c43-259f2d4b6a7c" containerName="reboot-os-openstack-openstack-networker" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.829814 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="48821e45-5c97-4162-8c43-259f2d4b6a7c" containerName="reboot-os-openstack-openstack-networker" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.830822 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.834281 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.845573 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-52rqc"] Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955083 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzxv\" (UniqueName: \"kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955202 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955504 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955710 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955748 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:19 crc kubenswrapper[5127]: I0201 09:12:19.955767 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.057968 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle\") 
pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.058028 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.058047 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.058154 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzxv\" (UniqueName: \"kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.058195 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.058231 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.063040 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.063451 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.063654 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.064128 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.072805 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.088871 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzxv\" (UniqueName: \"kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv\") pod \"install-certs-openstack-openstack-networker-52rqc\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.156785 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.709130 5127 generic.go:334] "Generic (PLEG): container finished" podID="7cf892b5-5638-474a-b426-aa1d7b4952af" containerID="80c9d2bd5f44ac48f1bf4e55c522c5eb797e25b35b21a09fe2a59ff75affee46" exitCode=0 Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.709259 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fjs75" event={"ID":"7cf892b5-5638-474a-b426-aa1d7b4952af","Type":"ContainerDied","Data":"80c9d2bd5f44ac48f1bf4e55c522c5eb797e25b35b21a09fe2a59ff75affee46"} Feb 01 09:12:20 crc kubenswrapper[5127]: I0201 09:12:20.814133 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-52rqc"] Feb 01 09:12:20 crc kubenswrapper[5127]: W0201 09:12:20.823659 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb8c388_3713_407f_8d2c_c8e5bdc446fb.slice/crio-3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f WatchSource:0}: Error finding container 3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f: Status 404 returned error can't find the container with id 3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f Feb 01 09:12:21 crc kubenswrapper[5127]: I0201 09:12:21.720972 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-52rqc" event={"ID":"9eb8c388-3713-407f-8d2c-c8e5bdc446fb","Type":"ContainerStarted","Data":"bd086e8eaec1e8023bee61831021e3769455e86c1e203ae0777db687a4f6c497"} Feb 01 09:12:21 crc kubenswrapper[5127]: I0201 09:12:21.721672 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-networker-52rqc" event={"ID":"9eb8c388-3713-407f-8d2c-c8e5bdc446fb","Type":"ContainerStarted","Data":"3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f"} Feb 01 09:12:21 crc kubenswrapper[5127]: I0201 09:12:21.748320 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-52rqc" podStartSLOduration=2.351201163 podStartE2EDuration="2.748295333s" podCreationTimestamp="2026-02-01 09:12:19 +0000 UTC" firstStartedPulling="2026-02-01 09:12:20.827789344 +0000 UTC m=+8691.313691717" lastFinishedPulling="2026-02-01 09:12:21.224883484 +0000 UTC m=+8691.710785887" observedRunningTime="2026-02-01 09:12:21.744208954 +0000 UTC m=+8692.230111327" watchObservedRunningTime="2026-02-01 09:12:21.748295333 +0000 UTC m=+8692.234197706" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.186513 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305400 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305478 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305658 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305808 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305937 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78kk9\" (UniqueName: \"kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.305985 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1\") pod \"7cf892b5-5638-474a-b426-aa1d7b4952af\" (UID: \"7cf892b5-5638-474a-b426-aa1d7b4952af\") " Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.311277 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph" (OuterVolumeSpecName: "ceph") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.311778 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9" (OuterVolumeSpecName: "kube-api-access-78kk9") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "kube-api-access-78kk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.335305 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.338768 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.346687 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.371346 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "7cf892b5-5638-474a-b426-aa1d7b4952af" (UID: "7cf892b5-5638-474a-b426-aa1d7b4952af"). InnerVolumeSpecName "inventory-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408134 5127 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408436 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78kk9\" (UniqueName: \"kubernetes.io/projected/7cf892b5-5638-474a-b426-aa1d7b4952af-kube-api-access-78kk9\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408449 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408457 5127 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-inventory-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408467 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.408477 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7cf892b5-5638-474a-b426-aa1d7b4952af-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.734788 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-fjs75" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.734778 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-fjs75" event={"ID":"7cf892b5-5638-474a-b426-aa1d7b4952af","Type":"ContainerDied","Data":"428d019b4a2e7ef16240004360691e03a5d51387f52a1d03a29d538cc0fea910"} Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.734983 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="428d019b4a2e7ef16240004360691e03a5d51387f52a1d03a29d538cc0fea910" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.870615 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-95565"] Feb 01 09:12:22 crc kubenswrapper[5127]: E0201 09:12:22.871694 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf892b5-5638-474a-b426-aa1d7b4952af" containerName="ssh-known-hosts-openstack" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.871839 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf892b5-5638-474a-b426-aa1d7b4952af" containerName="ssh-known-hosts-openstack" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.872458 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf892b5-5638-474a-b426-aa1d7b4952af" containerName="ssh-known-hosts-openstack" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.873816 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.880064 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.880567 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:12:22 crc kubenswrapper[5127]: I0201 09:12:22.912943 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-95565"] Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.026701 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.026896 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mpt2\" (UniqueName: \"kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.026958 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.026978 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.128940 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.129132 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mpt2\" (UniqueName: \"kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.129195 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" 
Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.129222 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.133745 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.133876 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.138573 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.148813 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mpt2\" (UniqueName: \"kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2\") pod \"run-os-openstack-openstack-cell1-95565\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.214424 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:23 crc kubenswrapper[5127]: I0201 09:12:23.777285 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-95565"] Feb 01 09:12:23 crc kubenswrapper[5127]: W0201 09:12:23.784699 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b38547b_b0d5_4063_8e6b_3dfe5977677f.slice/crio-e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76 WatchSource:0}: Error finding container e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76: Status 404 returned error can't find the container with id e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76 Feb 01 09:12:24 crc kubenswrapper[5127]: I0201 09:12:24.771438 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-95565" event={"ID":"0b38547b-b0d5-4063-8e6b-3dfe5977677f","Type":"ContainerStarted","Data":"6caee6db26b081f66a519815092239b31f8bd07ddb63aa17263383d0d2b7c587"} Feb 01 09:12:24 crc kubenswrapper[5127]: I0201 09:12:24.772128 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-95565" event={"ID":"0b38547b-b0d5-4063-8e6b-3dfe5977677f","Type":"ContainerStarted","Data":"e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76"} Feb 01 09:12:24 crc kubenswrapper[5127]: I0201 09:12:24.797756 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-95565" podStartSLOduration=2.369661613 podStartE2EDuration="2.797735212s" podCreationTimestamp="2026-02-01 09:12:22 +0000 UTC" firstStartedPulling="2026-02-01 09:12:23.78812718 +0000 UTC m=+8694.274029543" lastFinishedPulling="2026-02-01 09:12:24.216200779 +0000 UTC m=+8694.702103142" observedRunningTime="2026-02-01 09:12:24.78865138 +0000 UTC m=+8695.274553773" watchObservedRunningTime="2026-02-01 09:12:24.797735212 +0000 UTC m=+8695.283637575" Feb 01 09:12:32 crc kubenswrapper[5127]: I0201 09:12:32.864735 5127 generic.go:334] "Generic (PLEG): container finished" podID="9eb8c388-3713-407f-8d2c-c8e5bdc446fb" containerID="bd086e8eaec1e8023bee61831021e3769455e86c1e203ae0777db687a4f6c497" exitCode=0 Feb 01 09:12:32 crc kubenswrapper[5127]: I0201 09:12:32.864878 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-52rqc" event={"ID":"9eb8c388-3713-407f-8d2c-c8e5bdc446fb","Type":"ContainerDied","Data":"bd086e8eaec1e8023bee61831021e3769455e86c1e203ae0777db687a4f6c497"} Feb 01 09:12:32 crc kubenswrapper[5127]: I0201 09:12:32.868964 5127 generic.go:334] "Generic (PLEG): container finished" podID="0b38547b-b0d5-4063-8e6b-3dfe5977677f" containerID="6caee6db26b081f66a519815092239b31f8bd07ddb63aa17263383d0d2b7c587" exitCode=0 Feb 01 09:12:32 crc kubenswrapper[5127]: I0201 09:12:32.869011 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-95565" event={"ID":"0b38547b-b0d5-4063-8e6b-3dfe5977677f","Type":"ContainerDied","Data":"6caee6db26b081f66a519815092239b31f8bd07ddb63aa17263383d0d2b7c587"} Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.432379 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.599800 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph\") pod \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.599954 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1\") pod \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.600164 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory\") pod \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.600252 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mpt2\" (UniqueName: \"kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2\") pod \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\" (UID: \"0b38547b-b0d5-4063-8e6b-3dfe5977677f\") " Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.608420 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph" (OuterVolumeSpecName: "ceph") pod "0b38547b-b0d5-4063-8e6b-3dfe5977677f" (UID: "0b38547b-b0d5-4063-8e6b-3dfe5977677f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.608890 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2" (OuterVolumeSpecName: "kube-api-access-6mpt2") pod "0b38547b-b0d5-4063-8e6b-3dfe5977677f" (UID: "0b38547b-b0d5-4063-8e6b-3dfe5977677f"). InnerVolumeSpecName "kube-api-access-6mpt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.632330 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory" (OuterVolumeSpecName: "inventory") pod "0b38547b-b0d5-4063-8e6b-3dfe5977677f" (UID: "0b38547b-b0d5-4063-8e6b-3dfe5977677f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.634357 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0b38547b-b0d5-4063-8e6b-3dfe5977677f" (UID: "0b38547b-b0d5-4063-8e6b-3dfe5977677f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.702588 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.702627 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mpt2\" (UniqueName: \"kubernetes.io/projected/0b38547b-b0d5-4063-8e6b-3dfe5977677f-kube-api-access-6mpt2\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.702640 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.702648 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0b38547b-b0d5-4063-8e6b-3dfe5977677f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.896557 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-95565" event={"ID":"0b38547b-b0d5-4063-8e6b-3dfe5977677f","Type":"ContainerDied","Data":"e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76"} Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.896709 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4e2729e62274506809d58b5d96a26cac6f28028f9782660011a7788c922db76" Feb 01 09:12:34 crc kubenswrapper[5127]: I0201 09:12:34.896729 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-95565" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.028320 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdbff"] Feb 01 09:12:35 crc kubenswrapper[5127]: E0201 09:12:35.028936 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b38547b-b0d5-4063-8e6b-3dfe5977677f" containerName="run-os-openstack-openstack-cell1" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.028955 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b38547b-b0d5-4063-8e6b-3dfe5977677f" containerName="run-os-openstack-openstack-cell1" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.029225 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b38547b-b0d5-4063-8e6b-3dfe5977677f" containerName="run-os-openstack-openstack-cell1" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.030200 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.032479 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.034274 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.040843 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdbff"] Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.119994 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz99r\" (UniqueName: \"kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.120189 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.120290 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.120334 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.221917 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.222002 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.222033 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " 
pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.222064 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz99r\" (UniqueName: \"kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.227407 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.239367 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.239629 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.243166 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz99r\" (UniqueName: \"kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r\") pod \"reboot-os-openstack-openstack-cell1-bdbff\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.357082 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.368655 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.424913 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.425404 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.425516 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.425628 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.425709 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtzxv\" (UniqueName: \"kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.425785 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle\") pod \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\" (UID: \"9eb8c388-3713-407f-8d2c-c8e5bdc446fb\") " Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.430379 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.430875 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv" (OuterVolumeSpecName: "kube-api-access-qtzxv") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "kube-api-access-qtzxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.430901 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.432419 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.458017 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory" (OuterVolumeSpecName: "inventory") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.490090 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9eb8c388-3713-407f-8d2c-c8e5bdc446fb" (UID: "9eb8c388-3713-407f-8d2c-c8e5bdc446fb"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528859 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528896 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528907 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtzxv\" (UniqueName: \"kubernetes.io/projected/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-kube-api-access-qtzxv\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528917 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528927 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.528936 5127 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eb8c388-3713-407f-8d2c-c8e5bdc446fb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.909988 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-52rqc" event={"ID":"9eb8c388-3713-407f-8d2c-c8e5bdc446fb","Type":"ContainerDied","Data":"3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f"} Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.910328 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3afcc93bec81613c4ef30cf9226e4c2266ea35d2651b96920154b5fe7c04803f" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.910032 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-52rqc" Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.971734 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-bdbff"] Feb 01 09:12:35 crc kubenswrapper[5127]: W0201 09:12:35.971783 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b4f8d3_30e6_4ded_8c7b_6efec8dc4f69.slice/crio-81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547 WatchSource:0}: Error finding container 81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547: Status 404 returned error can't find the container with id 81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547 Feb 01 09:12:35 crc kubenswrapper[5127]: I0201 09:12:35.976206 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.492212 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-dkmwg"] Feb 01 09:12:36 crc kubenswrapper[5127]: E0201 09:12:36.493216 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eb8c388-3713-407f-8d2c-c8e5bdc446fb" containerName="install-certs-openstack-openstack-networker" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.493231 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eb8c388-3713-407f-8d2c-c8e5bdc446fb" containerName="install-certs-openstack-openstack-networker" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.493469 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eb8c388-3713-407f-8d2c-c8e5bdc446fb" containerName="install-certs-openstack-openstack-networker" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.494478 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.497027 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.498540 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.501019 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.508156 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-dkmwg"] Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.559675 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.559816 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.559873 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.560003 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.560061 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2mr\" (UniqueName: \"kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.662198 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.662265 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.662306 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.662341 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2mr\" (UniqueName: \"kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.662467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.663891 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.667169 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.667388 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.672691 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.685541 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2mr\" (UniqueName: \"kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr\") pod \"ovn-openstack-openstack-networker-dkmwg\" (UID: 
\"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.915407 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.925505 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" event={"ID":"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69","Type":"ContainerStarted","Data":"e3e7aa47da44a2a516669933e64f8c7f781de3674152d3d682bdaf84af155691"} Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.925553 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" event={"ID":"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69","Type":"ContainerStarted","Data":"81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547"} Feb 01 09:12:36 crc kubenswrapper[5127]: I0201 09:12:36.947028 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" podStartSLOduration=2.547023791 podStartE2EDuration="2.947007038s" podCreationTimestamp="2026-02-01 09:12:34 +0000 UTC" firstStartedPulling="2026-02-01 09:12:35.975867684 +0000 UTC m=+8706.461770057" lastFinishedPulling="2026-02-01 09:12:36.375850941 +0000 UTC m=+8706.861753304" observedRunningTime="2026-02-01 09:12:36.945146839 +0000 UTC m=+8707.431049232" watchObservedRunningTime="2026-02-01 09:12:36.947007038 +0000 UTC m=+8707.432909411" Feb 01 09:12:37 crc kubenswrapper[5127]: I0201 09:12:37.529411 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-dkmwg"] Feb 01 09:12:37 crc kubenswrapper[5127]: W0201 09:12:37.534135 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0c7f749_f26b_40b4_bbc1_38446be4a68d.slice/crio-bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b WatchSource:0}: Error finding container bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b: Status 404 returned error can't find the container with id bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b Feb 01 09:12:37 crc kubenswrapper[5127]: I0201 09:12:37.941864 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-dkmwg" event={"ID":"c0c7f749-f26b-40b4-bbc1-38446be4a68d","Type":"ContainerStarted","Data":"bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b"} Feb 01 09:12:38 crc kubenswrapper[5127]: I0201 09:12:38.969659 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-dkmwg" event={"ID":"c0c7f749-f26b-40b4-bbc1-38446be4a68d","Type":"ContainerStarted","Data":"83b121c5b35bf84ba22ef088235c51d1a65ab6d62140972cc25455f810abb099"} Feb 01 09:12:39 crc kubenswrapper[5127]: I0201 09:12:39.002993 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-dkmwg" podStartSLOduration=2.5234930220000003 podStartE2EDuration="3.002971236s" podCreationTimestamp="2026-02-01 09:12:36 +0000 UTC" firstStartedPulling="2026-02-01 09:12:37.537180312 +0000 UTC m=+8708.023082675" lastFinishedPulling="2026-02-01 09:12:38.016658526 +0000 UTC m=+8708.502560889" observedRunningTime="2026-02-01 09:12:38.994356405 +0000 UTC m=+8709.480258768" watchObservedRunningTime="2026-02-01 
09:12:39.002971236 +0000 UTC m=+8709.488873599" Feb 01 09:12:51 crc kubenswrapper[5127]: I0201 09:12:51.105200 5127 generic.go:334] "Generic (PLEG): container finished" podID="94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" containerID="e3e7aa47da44a2a516669933e64f8c7f781de3674152d3d682bdaf84af155691" exitCode=0 Feb 01 09:12:51 crc kubenswrapper[5127]: I0201 09:12:51.105249 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" event={"ID":"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69","Type":"ContainerDied","Data":"e3e7aa47da44a2a516669933e64f8c7f781de3674152d3d682bdaf84af155691"} Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.642683 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.776485 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph\") pod \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.776646 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1\") pod \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.776716 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz99r\" (UniqueName: \"kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r\") pod \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.776864 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory\") pod \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\" (UID: \"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69\") " Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.782833 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph" (OuterVolumeSpecName: "ceph") pod "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" (UID: "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.786309 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r" (OuterVolumeSpecName: "kube-api-access-wz99r") pod "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" (UID: "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69"). InnerVolumeSpecName "kube-api-access-wz99r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.807637 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" (UID: "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.826967 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory" (OuterVolumeSpecName: "inventory") pod "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" (UID: "94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.879041 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.879078 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.879092 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz99r\" (UniqueName: \"kubernetes.io/projected/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-kube-api-access-wz99r\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:52 crc kubenswrapper[5127]: I0201 09:12:52.879103 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.135121 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" event={"ID":"94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69","Type":"ContainerDied","Data":"81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547"} Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.135177 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f7a8c28b63bc5ce3ed0422b51e576c747a5706b22f62841d37135142326547" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.135191 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-bdbff" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.237330 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rccdw"] Feb 01 09:12:53 crc kubenswrapper[5127]: E0201 09:12:53.237924 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" containerName="reboot-os-openstack-openstack-cell1" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.237950 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" containerName="reboot-os-openstack-openstack-cell1" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.238323 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69" containerName="reboot-os-openstack-openstack-cell1" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.239449 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.242257 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.242502 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.249061 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rccdw"] Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.389370 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.389880 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.389926 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.389996 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9r79\" (UniqueName: \"kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390032 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390124 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390165 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390204 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390288 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390375 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390410 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.390450 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492286 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492367 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492433 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492502 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492537 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9r79\" (UniqueName: \"kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492571 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492656 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492690 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492717 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492777 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.492847 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.496675 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.497401 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.497474 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.498054 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.498149 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.499435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.510477 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.510715 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.511661 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.511742 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.515531 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.526883 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9r79\" (UniqueName: \"kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79\") pod \"install-certs-openstack-openstack-cell1-rccdw\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:53 crc kubenswrapper[5127]: I0201 09:12:53.555071 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:12:54 crc kubenswrapper[5127]: I0201 09:12:54.181314 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rccdw"] Feb 01 09:12:55 crc kubenswrapper[5127]: I0201 09:12:55.157762 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" event={"ID":"f36acb8e-4dbb-4655-9911-5c30e71c1287","Type":"ContainerStarted","Data":"0e3c78c572b02a7612f8cc2f0528fe483431a76b953ecb30a2de4b75e8fec972"} Feb 01 09:12:55 crc kubenswrapper[5127]: I0201 09:12:55.158314 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" event={"ID":"f36acb8e-4dbb-4655-9911-5c30e71c1287","Type":"ContainerStarted","Data":"7951f36a3d0e3029437e2591db9bd4948ae349eacacae5a7a7f9ec563ea2d32e"} Feb 01 09:12:55 crc kubenswrapper[5127]: I0201 09:12:55.189185 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" podStartSLOduration=1.771940558 podStartE2EDuration="2.189162037s" podCreationTimestamp="2026-02-01 09:12:53 +0000 UTC" firstStartedPulling="2026-02-01 09:12:54.193146979 +0000 UTC m=+8724.679049362" lastFinishedPulling="2026-02-01 09:12:54.610368438 +0000 UTC m=+8725.096270841" observedRunningTime="2026-02-01 09:12:55.178084341 +0000 UTC m=+8725.663986704" watchObservedRunningTime="2026-02-01 09:12:55.189162037 +0000 UTC m=+8725.675064420" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.464695 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.468152 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.486001 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.635373 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.635439 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6fkj\" (UniqueName: \"kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.635465 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.737136 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.737208 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6fkj\" (UniqueName: \"kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.737230 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.737722 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.737772 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.761756 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m6fkj\" (UniqueName: \"kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj\") pod \"redhat-operators-8m95f\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:11 crc kubenswrapper[5127]: I0201 09:13:11.784368 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:12 crc kubenswrapper[5127]: I0201 09:13:12.329221 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:12 crc kubenswrapper[5127]: I0201 09:13:12.356116 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerStarted","Data":"cf6da1cc14053e243838cafc2133acdecd33673324cbd50b38e5df77877082ef"} Feb 01 09:13:13 crc kubenswrapper[5127]: I0201 09:13:13.368960 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerID="224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af" exitCode=0 Feb 01 09:13:13 crc kubenswrapper[5127]: I0201 09:13:13.369107 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerDied","Data":"224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af"} Feb 01 09:13:14 crc kubenswrapper[5127]: I0201 09:13:14.379273 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerStarted","Data":"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322"} Feb 01 09:13:15 crc kubenswrapper[5127]: I0201 09:13:15.404074 5127 generic.go:334] "Generic (PLEG): container finished" podID="f36acb8e-4dbb-4655-9911-5c30e71c1287" containerID="0e3c78c572b02a7612f8cc2f0528fe483431a76b953ecb30a2de4b75e8fec972" exitCode=0 Feb 01 09:13:15 crc kubenswrapper[5127]: I0201 09:13:15.404146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" event={"ID":"f36acb8e-4dbb-4655-9911-5c30e71c1287","Type":"ContainerDied","Data":"0e3c78c572b02a7612f8cc2f0528fe483431a76b953ecb30a2de4b75e8fec972"} Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.052428 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.154106 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.154204 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.154276 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155126 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155571 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155645 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155718 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155749 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155824 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155877 5127 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-k9r79\" (UniqueName: \"kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155916 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.155945 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle\") pod \"f36acb8e-4dbb-4655-9911-5c30e71c1287\" (UID: \"f36acb8e-4dbb-4655-9911-5c30e71c1287\") " Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.161738 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.161945 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.162243 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.162349 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.162914 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.164045 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph" (OuterVolumeSpecName: "ceph") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.167546 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.167630 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.168673 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.176085 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79" (OuterVolumeSpecName: "kube-api-access-k9r79") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "kube-api-access-k9r79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.199017 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.206234 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory" (OuterVolumeSpecName: "inventory") pod "f36acb8e-4dbb-4655-9911-5c30e71c1287" (UID: "f36acb8e-4dbb-4655-9911-5c30e71c1287"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258694 5127 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258741 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258761 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258778 5127 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258791 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258802 5127 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258815 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258830 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258842 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258855 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9r79\" (UniqueName: \"kubernetes.io/projected/f36acb8e-4dbb-4655-9911-5c30e71c1287-kube-api-access-k9r79\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258867 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.258877 5127 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f36acb8e-4dbb-4655-9911-5c30e71c1287-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.439383 5127 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" event={"ID":"f36acb8e-4dbb-4655-9911-5c30e71c1287","Type":"ContainerDied","Data":"7951f36a3d0e3029437e2591db9bd4948ae349eacacae5a7a7f9ec563ea2d32e"} Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.439443 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7951f36a3d0e3029437e2591db9bd4948ae349eacacae5a7a7f9ec563ea2d32e" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.439497 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rccdw" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.552050 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8jrw4"] Feb 01 09:13:17 crc kubenswrapper[5127]: E0201 09:13:17.552780 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f36acb8e-4dbb-4655-9911-5c30e71c1287" containerName="install-certs-openstack-openstack-cell1" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.552890 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36acb8e-4dbb-4655-9911-5c30e71c1287" containerName="install-certs-openstack-openstack-cell1" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.553230 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="f36acb8e-4dbb-4655-9911-5c30e71c1287" containerName="install-certs-openstack-openstack-cell1" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.554039 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.559896 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.560227 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.563456 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8jrw4"] Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.666863 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bt58\" (UniqueName: \"kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.667163 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.667239 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.667339 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.769828 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.769926 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.769998 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.770043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bt58\" (UniqueName: \"kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.774427 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.775011 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.776698 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.791530 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bt58\" (UniqueName: 
\"kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58\") pod \"ceph-client-openstack-openstack-cell1-8jrw4\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:17 crc kubenswrapper[5127]: I0201 09:13:17.872252 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:18 crc kubenswrapper[5127]: I0201 09:13:18.575276 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-8jrw4"] Feb 01 09:13:19 crc kubenswrapper[5127]: I0201 09:13:19.489127 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerID="bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322" exitCode=0 Feb 01 09:13:19 crc kubenswrapper[5127]: I0201 09:13:19.489573 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerDied","Data":"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322"} Feb 01 09:13:19 crc kubenswrapper[5127]: I0201 09:13:19.492981 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" event={"ID":"fe0d7dc2-efc9-47da-b525-c38469d4a8ce","Type":"ContainerStarted","Data":"2bae2ae004026d97c3c0f02f808fc20b01b7f001e721b9dadc8b1f9a0dd501f3"} Feb 01 09:13:19 crc kubenswrapper[5127]: I0201 09:13:19.493051 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" event={"ID":"fe0d7dc2-efc9-47da-b525-c38469d4a8ce","Type":"ContainerStarted","Data":"e28f52c06e2ca6978acf4a4d86815b33a183a4b849263db2b0ff3b86174da985"} Feb 01 09:13:19 crc kubenswrapper[5127]: I0201 09:13:19.537261 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" podStartSLOduration=2.048962828 podStartE2EDuration="2.537220356s" podCreationTimestamp="2026-02-01 09:13:17 +0000 UTC" firstStartedPulling="2026-02-01 09:13:18.588217644 +0000 UTC m=+8749.074120017" lastFinishedPulling="2026-02-01 09:13:19.076475162 +0000 UTC m=+8749.562377545" observedRunningTime="2026-02-01 09:13:19.531152773 +0000 UTC m=+8750.017055136" watchObservedRunningTime="2026-02-01 09:13:19.537220356 +0000 UTC m=+8750.023122719" Feb 01 09:13:20 crc kubenswrapper[5127]: I0201 09:13:20.507213 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerStarted","Data":"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426"} Feb 01 09:13:20 crc kubenswrapper[5127]: I0201 09:13:20.545152 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8m95f" podStartSLOduration=2.996487683 podStartE2EDuration="9.545132082s" podCreationTimestamp="2026-02-01 09:13:11 +0000 UTC" firstStartedPulling="2026-02-01 09:13:13.372975838 +0000 UTC m=+8743.858878201" lastFinishedPulling="2026-02-01 09:13:19.921620227 +0000 UTC m=+8750.407522600" observedRunningTime="2026-02-01 09:13:20.532154986 +0000 UTC m=+8751.018057349" watchObservedRunningTime="2026-02-01 09:13:20.545132082 +0000 UTC m=+8751.031034455" Feb 01 09:13:21 crc kubenswrapper[5127]: I0201 09:13:21.785473 5127 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:21 crc kubenswrapper[5127]: I0201 09:13:21.785845 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:22 crc kubenswrapper[5127]: I0201 09:13:22.835427 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8m95f" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" probeResult="failure" output=< Feb 01 09:13:22 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:13:22 crc kubenswrapper[5127]: > Feb 01 09:13:25 crc kubenswrapper[5127]: I0201 09:13:25.554036 5127 generic.go:334] "Generic (PLEG): container finished" podID="fe0d7dc2-efc9-47da-b525-c38469d4a8ce" containerID="2bae2ae004026d97c3c0f02f808fc20b01b7f001e721b9dadc8b1f9a0dd501f3" exitCode=0 Feb 01 09:13:25 crc kubenswrapper[5127]: I0201 09:13:25.554137 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" event={"ID":"fe0d7dc2-efc9-47da-b525-c38469d4a8ce","Type":"ContainerDied","Data":"2bae2ae004026d97c3c0f02f808fc20b01b7f001e721b9dadc8b1f9a0dd501f3"} Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.003814 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.081889 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bt58\" (UniqueName: \"kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58\") pod \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.082037 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1\") pod \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.082146 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph\") pod \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.082196 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory\") pod \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\" (UID: \"fe0d7dc2-efc9-47da-b525-c38469d4a8ce\") " Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.093907 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58" (OuterVolumeSpecName: "kube-api-access-8bt58") pod "fe0d7dc2-efc9-47da-b525-c38469d4a8ce" (UID: "fe0d7dc2-efc9-47da-b525-c38469d4a8ce"). InnerVolumeSpecName "kube-api-access-8bt58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.094764 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph" (OuterVolumeSpecName: "ceph") pod "fe0d7dc2-efc9-47da-b525-c38469d4a8ce" (UID: "fe0d7dc2-efc9-47da-b525-c38469d4a8ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.145697 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory" (OuterVolumeSpecName: "inventory") pod "fe0d7dc2-efc9-47da-b525-c38469d4a8ce" (UID: "fe0d7dc2-efc9-47da-b525-c38469d4a8ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.160814 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fe0d7dc2-efc9-47da-b525-c38469d4a8ce" (UID: "fe0d7dc2-efc9-47da-b525-c38469d4a8ce"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.184328 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.184359 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.184370 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bt58\" (UniqueName: \"kubernetes.io/projected/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-kube-api-access-8bt58\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.184379 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fe0d7dc2-efc9-47da-b525-c38469d4a8ce-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.600167 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" event={"ID":"fe0d7dc2-efc9-47da-b525-c38469d4a8ce","Type":"ContainerDied","Data":"e28f52c06e2ca6978acf4a4d86815b33a183a4b849263db2b0ff3b86174da985"} Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.601919 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28f52c06e2ca6978acf4a4d86815b33a183a4b849263db2b0ff3b86174da985" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.600353 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-8jrw4" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.675371 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b4fz9"] Feb 01 09:13:27 crc kubenswrapper[5127]: E0201 09:13:27.676186 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0d7dc2-efc9-47da-b525-c38469d4a8ce" containerName="ceph-client-openstack-openstack-cell1" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.676218 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0d7dc2-efc9-47da-b525-c38469d4a8ce" containerName="ceph-client-openstack-openstack-cell1" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.676460 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0d7dc2-efc9-47da-b525-c38469d4a8ce" containerName="ceph-client-openstack-openstack-cell1" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.677670 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.681175 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.681483 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.696964 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b4fz9"] Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712665 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712714 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712750 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712790 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mpq\" (UniqueName: 
\"kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.712906 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.813960 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.814038 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.814098 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.814172 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mpq\" (UniqueName: \"kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.814308 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.814368 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.815372 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc 
kubenswrapper[5127]: I0201 09:13:27.818990 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.819109 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.819991 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.830268 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:27 crc kubenswrapper[5127]: I0201 09:13:27.834620 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mpq\" (UniqueName: \"kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq\") pod \"ovn-openstack-openstack-cell1-b4fz9\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:28 crc kubenswrapper[5127]: I0201 09:13:28.019021 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:13:28 crc kubenswrapper[5127]: I0201 09:13:28.664436 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-b4fz9"] Feb 01 09:13:29 crc kubenswrapper[5127]: I0201 09:13:29.626337 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" event={"ID":"36c41840-5738-4cb4-973b-c8d38371cfcb","Type":"ContainerStarted","Data":"16a172da4882704b5fde7246c0a5bfd3177ee6e72616f404fbe66f2a0cc4e7ea"} Feb 01 09:13:29 crc kubenswrapper[5127]: I0201 09:13:29.626389 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" event={"ID":"36c41840-5738-4cb4-973b-c8d38371cfcb","Type":"ContainerStarted","Data":"ffa19f9c982017953bd875733055ece95f3dc7ff1ac608e779602c0867eec54b"} Feb 01 09:13:29 crc kubenswrapper[5127]: I0201 09:13:29.645231 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" podStartSLOduration=2.2221524759999998 podStartE2EDuration="2.645210501s" podCreationTimestamp="2026-02-01 09:13:27 +0000 UTC" firstStartedPulling="2026-02-01 09:13:28.675162076 +0000 UTC m=+8759.161064439" lastFinishedPulling="2026-02-01 09:13:29.098220101 +0000 UTC m=+8759.584122464" observedRunningTime="2026-02-01 09:13:29.639435686 +0000 UTC m=+8760.125338039" watchObservedRunningTime="2026-02-01 09:13:29.645210501 +0000 UTC m=+8760.131112884" Feb 01 09:13:33 crc kubenswrapper[5127]: I0201 09:13:33.269875 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8m95f" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" probeResult="failure" output=< Feb 01 09:13:33 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:13:33 crc kubenswrapper[5127]: > Feb 01 09:13:42 crc kubenswrapper[5127]: I0201 09:13:42.842095 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8m95f" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" probeResult="failure" output=< Feb 01 09:13:42 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:13:42 crc kubenswrapper[5127]: > Feb 01 09:13:51 crc kubenswrapper[5127]: I0201 09:13:51.852982 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:51 crc kubenswrapper[5127]: I0201 09:13:51.919915 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:52 crc kubenswrapper[5127]: I0201 09:13:52.096980 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.124144 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8m95f" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" containerID="cri-o://6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426" gracePeriod=2 Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.695527 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.817170 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities\") pod \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.817246 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6fkj\" (UniqueName: \"kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj\") pod \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.817408 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content\") pod \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\" (UID: \"ff5ab748-f68f-4d4d-801c-0079fd85ff74\") " Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.818276 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities" (OuterVolumeSpecName: "utilities") pod "ff5ab748-f68f-4d4d-801c-0079fd85ff74" (UID: "ff5ab748-f68f-4d4d-801c-0079fd85ff74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.822982 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj" (OuterVolumeSpecName: "kube-api-access-m6fkj") pod "ff5ab748-f68f-4d4d-801c-0079fd85ff74" (UID: "ff5ab748-f68f-4d4d-801c-0079fd85ff74"). InnerVolumeSpecName "kube-api-access-m6fkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.920429 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.920478 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6fkj\" (UniqueName: \"kubernetes.io/projected/ff5ab748-f68f-4d4d-801c-0079fd85ff74-kube-api-access-m6fkj\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:53 crc kubenswrapper[5127]: I0201 09:13:53.925162 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff5ab748-f68f-4d4d-801c-0079fd85ff74" (UID: "ff5ab748-f68f-4d4d-801c-0079fd85ff74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.025625 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff5ab748-f68f-4d4d-801c-0079fd85ff74-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.135794 5127 generic.go:334] "Generic (PLEG): container finished" podID="c0c7f749-f26b-40b4-bbc1-38446be4a68d" containerID="83b121c5b35bf84ba22ef088235c51d1a65ab6d62140972cc25455f810abb099" exitCode=0 Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.135865 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-dkmwg" event={"ID":"c0c7f749-f26b-40b4-bbc1-38446be4a68d","Type":"ContainerDied","Data":"83b121c5b35bf84ba22ef088235c51d1a65ab6d62140972cc25455f810abb099"} Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.139261 5127 generic.go:334] "Generic (PLEG): container finished" podID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerID="6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426" exitCode=0 Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.139308 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerDied","Data":"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426"} Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.139350 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8m95f" event={"ID":"ff5ab748-f68f-4d4d-801c-0079fd85ff74","Type":"ContainerDied","Data":"cf6da1cc14053e243838cafc2133acdecd33673324cbd50b38e5df77877082ef"} Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.139382 5127 scope.go:117] "RemoveContainer" containerID="6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.139404 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8m95f" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.168152 5127 scope.go:117] "RemoveContainer" containerID="bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.197564 5127 scope.go:117] "RemoveContainer" containerID="224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.204893 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.216735 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8m95f"] Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.249168 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" path="/var/lib/kubelet/pods/ff5ab748-f68f-4d4d-801c-0079fd85ff74/volumes" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.250421 5127 scope.go:117] "RemoveContainer" containerID="6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426" Feb 01 09:13:54 crc kubenswrapper[5127]: E0201 09:13:54.252247 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426\": container with ID starting with 6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426 not found: ID does not exist" containerID="6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.252318 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426"} err="failed to get container status \"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426\": rpc error: code = NotFound desc = could not find container \"6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426\": container with ID starting with 6650738e3478cd4e24f2e8507c1a3fe63915a9479e3170135a526371b69b3426 not found: ID does not exist" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.252375 5127 scope.go:117] "RemoveContainer" containerID="bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322" Feb 01 09:13:54 crc kubenswrapper[5127]: E0201 09:13:54.252826 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322\": container with ID starting with bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322 not found: ID does not exist" containerID="bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.252882 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322"} err="failed to get container status \"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322\": rpc error: code = NotFound desc = could not find container \"bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322\": container with ID starting with bd9fb948442f7f07c38edaa5299b9d8259f4185c4f067fe38f1ab6fc702f5322 not found: ID does not exist" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 
09:13:54.252911 5127 scope.go:117] "RemoveContainer" containerID="224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af" Feb 01 09:13:54 crc kubenswrapper[5127]: E0201 09:13:54.253261 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af\": container with ID starting with 224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af not found: ID does not exist" containerID="224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af" Feb 01 09:13:54 crc kubenswrapper[5127]: I0201 09:13:54.253309 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af"} err="failed to get container status \"224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af\": rpc error: code = NotFound desc = could not find container \"224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af\": container with ID starting with 224226c6b2ecc4641f59acf96af33003887269aba42bcf15d841cea10be434af not found: ID does not exist" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.676334 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.796040 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory\") pod \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.796134 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker\") pod \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.796234 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q2mr\" (UniqueName: \"kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr\") pod \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.796370 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle\") pod \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.796449 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0\") pod \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\" (UID: \"c0c7f749-f26b-40b4-bbc1-38446be4a68d\") " Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.803766 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr" (OuterVolumeSpecName: "kube-api-access-4q2mr") pod "c0c7f749-f26b-40b4-bbc1-38446be4a68d" (UID: 
"c0c7f749-f26b-40b4-bbc1-38446be4a68d"). InnerVolumeSpecName "kube-api-access-4q2mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.803826 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c0c7f749-f26b-40b4-bbc1-38446be4a68d" (UID: "c0c7f749-f26b-40b4-bbc1-38446be4a68d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.834174 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "c0c7f749-f26b-40b4-bbc1-38446be4a68d" (UID: "c0c7f749-f26b-40b4-bbc1-38446be4a68d"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.835366 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "c0c7f749-f26b-40b4-bbc1-38446be4a68d" (UID: "c0c7f749-f26b-40b4-bbc1-38446be4a68d"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.838232 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory" (OuterVolumeSpecName: "inventory") pod "c0c7f749-f26b-40b4-bbc1-38446be4a68d" (UID: "c0c7f749-f26b-40b4-bbc1-38446be4a68d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.898940 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.898977 5127 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.898989 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.898999 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/c0c7f749-f26b-40b4-bbc1-38446be4a68d-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:55 crc kubenswrapper[5127]: I0201 09:13:55.899009 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q2mr\" (UniqueName: \"kubernetes.io/projected/c0c7f749-f26b-40b4-bbc1-38446be4a68d-kube-api-access-4q2mr\") on node \"crc\" DevicePath \"\"" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.181064 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-dkmwg" event={"ID":"c0c7f749-f26b-40b4-bbc1-38446be4a68d","Type":"ContainerDied","Data":"bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b"} Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.181616 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf9393e2386257c51bf4c37f2d22ecb1d2d13b889d96b294526c102b71d322b" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.181139 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-dkmwg" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.282776 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-s9g4w"] Feb 01 09:13:56 crc kubenswrapper[5127]: E0201 09:13:56.283393 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="extract-utilities" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283414 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="extract-utilities" Feb 01 09:13:56 crc kubenswrapper[5127]: E0201 09:13:56.283449 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283458 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" Feb 01 09:13:56 crc kubenswrapper[5127]: E0201 09:13:56.283480 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c7f749-f26b-40b4-bbc1-38446be4a68d" containerName="ovn-openstack-openstack-networker" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283489 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c7f749-f26b-40b4-bbc1-38446be4a68d" containerName="ovn-openstack-openstack-networker" Feb 01 09:13:56 crc kubenswrapper[5127]: E0201 09:13:56.283511 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="extract-content" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283520 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="extract-content" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283778 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c7f749-f26b-40b4-bbc1-38446be4a68d" containerName="ovn-openstack-openstack-networker" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.283807 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5ab748-f68f-4d4d-801c-0079fd85ff74" containerName="registry-server" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.284752 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.288128 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.288440 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.288853 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-7hhgq" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.292964 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.305030 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-s9g4w"] Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408287 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408387 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408421 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408522 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408558 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjm8g\" (UniqueName: \"kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.408620 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.509784 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.509842 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjm8g\" (UniqueName: \"kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.509884 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.509924 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.509983 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.510009 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.514145 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc 
kubenswrapper[5127]: I0201 09:13:56.514930 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.516278 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.517375 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.527648 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.528183 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjm8g\" (UniqueName: \"kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g\") pod \"neutron-metadata-openstack-openstack-networker-s9g4w\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:56 crc kubenswrapper[5127]: I0201 09:13:56.613895 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:13:57 crc kubenswrapper[5127]: I0201 09:13:57.220380 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-s9g4w"] Feb 01 09:13:58 crc kubenswrapper[5127]: I0201 09:13:58.231401 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" event={"ID":"56a60225-4a4f-47ae-bb41-a4510a34a915","Type":"ContainerStarted","Data":"73b2fcaa288d1c493c10cff000c66c33ee2de069a52aa45452403a3be8e69d13"} Feb 01 09:13:58 crc kubenswrapper[5127]: I0201 09:13:58.231675 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" event={"ID":"56a60225-4a4f-47ae-bb41-a4510a34a915","Type":"ContainerStarted","Data":"64cc93e050e1460f987602025fc2bfad7307a33025670c3de8256800eda50856"} Feb 01 09:13:58 crc kubenswrapper[5127]: I0201 09:13:58.260907 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" podStartSLOduration=1.85381424 podStartE2EDuration="2.260878828s" podCreationTimestamp="2026-02-01 09:13:56 +0000 UTC" firstStartedPulling="2026-02-01 09:13:57.209382714 +0000 UTC m=+8787.695285097" lastFinishedPulling="2026-02-01 09:13:57.616447312 +0000 UTC m=+8788.102349685" observedRunningTime="2026-02-01 09:13:58.253998993 +0000 UTC m=+8788.739901386" watchObservedRunningTime="2026-02-01 09:13:58.260878828 +0000 UTC m=+8788.746781231" Feb 01 09:14:36 crc kubenswrapper[5127]: I0201 09:14:36.740974 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:14:36 crc kubenswrapper[5127]: I0201 09:14:36.743439 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:14:43 crc kubenswrapper[5127]: I0201 09:14:43.302547 5127 generic.go:334] "Generic (PLEG): container finished" podID="36c41840-5738-4cb4-973b-c8d38371cfcb" containerID="16a172da4882704b5fde7246c0a5bfd3177ee6e72616f404fbe66f2a0cc4e7ea" exitCode=0 Feb 01 09:14:43 crc kubenswrapper[5127]: I0201 09:14:43.303084 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" event={"ID":"36c41840-5738-4cb4-973b-c8d38371cfcb","Type":"ContainerDied","Data":"16a172da4882704b5fde7246c0a5bfd3177ee6e72616f404fbe66f2a0cc4e7ea"} Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.829786 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.891392 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.891899 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.891982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68mpq\" (UniqueName: \"kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.892869 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.892979 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:44 crc kubenswrapper[5127]: I0201 09:14:44.893052 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0\") pod \"36c41840-5738-4cb4-973b-c8d38371cfcb\" (UID: \"36c41840-5738-4cb4-973b-c8d38371cfcb\") " Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.323599 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" event={"ID":"36c41840-5738-4cb4-973b-c8d38371cfcb","Type":"ContainerDied","Data":"ffa19f9c982017953bd875733055ece95f3dc7ff1ac608e779602c0867eec54b"} Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.323655 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-b4fz9" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.323670 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa19f9c982017953bd875733055ece95f3dc7ff1ac608e779602c0867eec54b" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.448875 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gdkvb"] Feb 01 09:14:45 crc kubenswrapper[5127]: E0201 09:14:45.449454 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c41840-5738-4cb4-973b-c8d38371cfcb" containerName="ovn-openstack-openstack-cell1" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.449514 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c41840-5738-4cb4-973b-c8d38371cfcb" containerName="ovn-openstack-openstack-cell1" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.449884 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c41840-5738-4cb4-973b-c8d38371cfcb" containerName="ovn-openstack-openstack-cell1" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.450622 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.472553 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gdkvb"] Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.508865 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509065 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509148 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509311 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509359 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509495 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509529 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzwpd\" (UniqueName: \"kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509956 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph" (OuterVolumeSpecName: "ceph") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.509985 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.510057 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq" (OuterVolumeSpecName: "kube-api-access-68mpq") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "kube-api-access-68mpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.539463 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.611910 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory" (OuterVolumeSpecName: "inventory") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.612090 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.612402 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.612680 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.612743 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzwpd\" (UniqueName: \"kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613203 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613286 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613399 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613420 5127 reconciler_common.go:293] 
"Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613437 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613449 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36c41840-5738-4cb4-973b-c8d38371cfcb-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.613459 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68mpq\" (UniqueName: \"kubernetes.io/projected/36c41840-5738-4cb4-973b-c8d38371cfcb-kube-api-access-68mpq\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.617190 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.617529 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.617692 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.617781 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.621670 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.631139 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzwpd\" (UniqueName: \"kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.681007 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-gdkvb\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.706069 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "36c41840-5738-4cb4-973b-c8d38371cfcb" (UID: "36c41840-5738-4cb4-973b-c8d38371cfcb"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.715751 5127 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/36c41840-5738-4cb4-973b-c8d38371cfcb-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:45 crc kubenswrapper[5127]: I0201 09:14:45.801531 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:14:46 crc kubenswrapper[5127]: I0201 09:14:46.466345 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-gdkvb"] Feb 01 09:14:47 crc kubenswrapper[5127]: I0201 09:14:47.341027 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" event={"ID":"9032a10b-4966-4895-a50f-e0d4682049e9","Type":"ContainerStarted","Data":"46fb46c8ccb78b5be7198e8eb3a23eeb572c904897eee2265b863def18df1e59"} Feb 01 09:14:48 crc kubenswrapper[5127]: I0201 09:14:48.351421 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" event={"ID":"9032a10b-4966-4895-a50f-e0d4682049e9","Type":"ContainerStarted","Data":"a244a7257eacbd8ddc33521e95d4500d4f2fd24296cc2690385ffc265eb29bd4"} Feb 01 09:14:48 crc kubenswrapper[5127]: I0201 09:14:48.370191 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" podStartSLOduration=2.721357924 podStartE2EDuration="3.370168876s" podCreationTimestamp="2026-02-01 09:14:45 +0000 UTC" firstStartedPulling="2026-02-01 09:14:46.47876143 +0000 UTC m=+8836.964663793" lastFinishedPulling="2026-02-01 09:14:47.127572382 +0000 UTC m=+8837.613474745" observedRunningTime="2026-02-01 09:14:48.366890009 +0000 UTC m=+8838.852792392" watchObservedRunningTime="2026-02-01 09:14:48.370168876 +0000 UTC m=+8838.856071249" Feb 01 09:14:55 crc kubenswrapper[5127]: I0201 09:14:55.454948 5127 generic.go:334] "Generic (PLEG): container finished" podID="56a60225-4a4f-47ae-bb41-a4510a34a915" containerID="73b2fcaa288d1c493c10cff000c66c33ee2de069a52aa45452403a3be8e69d13" exitCode=0 Feb 01 09:14:55 crc kubenswrapper[5127]: I0201 09:14:55.455059 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" 
event={"ID":"56a60225-4a4f-47ae-bb41-a4510a34a915","Type":"ContainerDied","Data":"73b2fcaa288d1c493c10cff000c66c33ee2de069a52aa45452403a3be8e69d13"} Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.011117 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.091449 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.091574 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.091704 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.091839 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.091882 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjm8g\" (UniqueName: \"kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.092077 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0\") pod \"56a60225-4a4f-47ae-bb41-a4510a34a915\" (UID: \"56a60225-4a4f-47ae-bb41-a4510a34a915\") " Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.099229 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.104094 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g" (OuterVolumeSpecName: "kube-api-access-mjm8g") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "kube-api-access-mjm8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.124164 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.126507 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.137017 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory" (OuterVolumeSpecName: "inventory") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.148624 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "56a60225-4a4f-47ae-bb41-a4510a34a915" (UID: "56a60225-4a4f-47ae-bb41-a4510a34a915"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195784 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195830 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjm8g\" (UniqueName: \"kubernetes.io/projected/56a60225-4a4f-47ae-bb41-a4510a34a915-kube-api-access-mjm8g\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195847 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195862 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195874 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.195887 5127 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/56a60225-4a4f-47ae-bb41-a4510a34a915-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.474654 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w" event={"ID":"56a60225-4a4f-47ae-bb41-a4510a34a915","Type":"ContainerDied","Data":"64cc93e050e1460f987602025fc2bfad7307a33025670c3de8256800eda50856"} Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.474687 5127 util.go:48] "No ready sandbox for pod can be found. 
Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.474687 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-s9g4w"
Feb 01 09:14:57 crc kubenswrapper[5127]: I0201 09:14:57.474717 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64cc93e050e1460f987602025fc2bfad7307a33025670c3de8256800eda50856"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.150050 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"]
Feb 01 09:15:00 crc kubenswrapper[5127]: E0201 09:15:00.151162 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a60225-4a4f-47ae-bb41-a4510a34a915" containerName="neutron-metadata-openstack-openstack-networker"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.151182 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a60225-4a4f-47ae-bb41-a4510a34a915" containerName="neutron-metadata-openstack-openstack-networker"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.151481 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a60225-4a4f-47ae-bb41-a4510a34a915" containerName="neutron-metadata-openstack-openstack-networker"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.152668 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.154941 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.156080 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.166809 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"]
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.292617 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.292677 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjsp\" (UniqueName: \"kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.292843 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"
Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.394443 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.394522 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjsp\" (UniqueName: \"kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.394775 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.397193 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.419064 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjsp\" (UniqueName: \"kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.419108 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume\") pod \"collect-profiles-29498955-g8qbw\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.486677 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:00 crc kubenswrapper[5127]: I0201 09:15:00.950240 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"] Feb 01 09:15:00 crc kubenswrapper[5127]: W0201 09:15:00.953835 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c9ea12_9b4a_4c88_b0a1_810b48999166.slice/crio-00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c WatchSource:0}: Error finding container 00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c: Status 404 returned error can't find the container with id 00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c Feb 01 09:15:01 crc kubenswrapper[5127]: I0201 09:15:01.525957 5127 generic.go:334] "Generic (PLEG): container finished" podID="02c9ea12-9b4a-4c88-b0a1-810b48999166" containerID="50e30ed260e77de219c101046127e2ac6a35c03f8e019ed982eaf2f1599714c8" exitCode=0 Feb 01 09:15:01 crc kubenswrapper[5127]: I0201 09:15:01.526275 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" event={"ID":"02c9ea12-9b4a-4c88-b0a1-810b48999166","Type":"ContainerDied","Data":"50e30ed260e77de219c101046127e2ac6a35c03f8e019ed982eaf2f1599714c8"} Feb 01 09:15:01 crc kubenswrapper[5127]: I0201 09:15:01.526501 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" event={"ID":"02c9ea12-9b4a-4c88-b0a1-810b48999166","Type":"ContainerStarted","Data":"00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c"} Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.029659 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.179556 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume\") pod \"02c9ea12-9b4a-4c88-b0a1-810b48999166\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.179685 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume\") pod \"02c9ea12-9b4a-4c88-b0a1-810b48999166\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.179739 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjsp\" (UniqueName: \"kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp\") pod \"02c9ea12-9b4a-4c88-b0a1-810b48999166\" (UID: \"02c9ea12-9b4a-4c88-b0a1-810b48999166\") " Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.182172 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume" (OuterVolumeSpecName: "config-volume") pod "02c9ea12-9b4a-4c88-b0a1-810b48999166" (UID: "02c9ea12-9b4a-4c88-b0a1-810b48999166"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.187096 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "02c9ea12-9b4a-4c88-b0a1-810b48999166" (UID: "02c9ea12-9b4a-4c88-b0a1-810b48999166"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.187634 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp" (OuterVolumeSpecName: "kube-api-access-kjjsp") pod "02c9ea12-9b4a-4c88-b0a1-810b48999166" (UID: "02c9ea12-9b4a-4c88-b0a1-810b48999166"). InnerVolumeSpecName "kube-api-access-kjjsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.282432 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02c9ea12-9b4a-4c88-b0a1-810b48999166-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.282478 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02c9ea12-9b4a-4c88-b0a1-810b48999166-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.282492 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjsp\" (UniqueName: \"kubernetes.io/projected/02c9ea12-9b4a-4c88-b0a1-810b48999166-kube-api-access-kjjsp\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.569868 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" event={"ID":"02c9ea12-9b4a-4c88-b0a1-810b48999166","Type":"ContainerDied","Data":"00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c"} Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.569929 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00464d5c7ea43a148db457b2c519560da3b8bbe166095a7534f1be441722fe8c" Feb 01 09:15:03 crc kubenswrapper[5127]: I0201 09:15:03.570246 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw" Feb 01 09:15:04 crc kubenswrapper[5127]: I0201 09:15:04.133844 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k"] Feb 01 09:15:04 crc kubenswrapper[5127]: I0201 09:15:04.145708 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498910-fz78k"] Feb 01 09:15:04 crc kubenswrapper[5127]: I0201 09:15:04.261609 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afc22c56-417d-47ce-92f5-72f5b9e3d2fb" path="/var/lib/kubelet/pods/afc22c56-417d-47ce-92f5-72f5b9e3d2fb/volumes" Feb 01 09:15:06 crc kubenswrapper[5127]: I0201 09:15:06.741224 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:15:06 crc kubenswrapper[5127]: I0201 09:15:06.741522 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.740396 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.740857 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.740900 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.741625 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.741670 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" gracePeriod=600 Feb 01 09:15:36 crc kubenswrapper[5127]: E0201 09:15:36.877979 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.982237 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" exitCode=0
Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.982286 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"}
Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.982326 5127 scope.go:117] "RemoveContainer" containerID="c485e70a9873e3942903d5d8141fd5764a60f96cd54d7e2f63e9fc092f3df951"
Feb 01 09:15:36 crc kubenswrapper[5127]: I0201 09:15:36.982854 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:15:36 crc kubenswrapper[5127]: E0201 09:15:36.999017 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:15:45 crc kubenswrapper[5127]: I0201 09:15:45.077216 5127 generic.go:334] "Generic (PLEG): container finished" podID="9032a10b-4966-4895-a50f-e0d4682049e9" containerID="a244a7257eacbd8ddc33521e95d4500d4f2fd24296cc2690385ffc265eb29bd4" exitCode=0
Feb 01 09:15:45 crc kubenswrapper[5127]: I0201 09:15:45.077281 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" event={"ID":"9032a10b-4966-4895-a50f-e0d4682049e9","Type":"ContainerDied","Data":"a244a7257eacbd8ddc33521e95d4500d4f2fd24296cc2690385ffc265eb29bd4"}
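The machine-config-daemon entries around this point show the whole probe-kill-backoff chain: repeated liveness failures on 127.0.0.1:8798, "Killing container with a grace period", the ContainerDied PLEG event, "RemoveContainer", and then a CrashLoopBackOff refusal roughly every ten to fifteen seconds until the 5m0s back-off expires. A sketch that tallies those stages for one container, using marker substrings copied from this log (the scope.go RemoveContainer lines carry only a container ID, not the pod name, so they are deliberately left out of the tally):

    from collections import Counter

    MARKERS = {                                      # substring -> stage label
        'Probe failed':                          'probe_failed',
        'Killing container with a grace period': 'killing',
        '"Type":"ContainerDied"':                'container_died',
        'CrashLoopBackOff':                      'backoff_refusal',
    }

    def crashloop_tally(entries, needle='machine-config-daemon'):
        counts = Counter()
        for e in entries:
            if needle not in e:
                continue
            for marker, stage in MARKERS.items():
                if marker in e:
                    counts[stage] += 1
        return counts                                # e.g. backoff_refusal keeps climbing
                                                     # while container_died stays at 1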
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.632497 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb"
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.710895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.710988 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.711069 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.711186 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.711264 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.711306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.711440 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzwpd\" (UniqueName: \"kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd\") pod \"9032a10b-4966-4895-a50f-e0d4682049e9\" (UID: \"9032a10b-4966-4895-a50f-e0d4682049e9\") "
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.718848 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph" (OuterVolumeSpecName: "ceph") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.721482 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.723884 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd" (OuterVolumeSpecName: "kube-api-access-zzwpd") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "kube-api-access-zzwpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.746026 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory" (OuterVolumeSpecName: "inventory") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.752111 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.757404 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.761809 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9032a10b-4966-4895-a50f-e0d4682049e9" (UID: "9032a10b-4966-4895-a50f-e0d4682049e9"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.814938 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.814996 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.815015 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.815035 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.815059 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzwpd\" (UniqueName: \"kubernetes.io/projected/9032a10b-4966-4895-a50f-e0d4682049e9-kube-api-access-zzwpd\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.815078 5127 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:46 crc kubenswrapper[5127]: I0201 09:15:46.815097 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9032a10b-4966-4895-a50f-e0d4682049e9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.101304 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" event={"ID":"9032a10b-4966-4895-a50f-e0d4682049e9","Type":"ContainerDied","Data":"46fb46c8ccb78b5be7198e8eb3a23eeb572c904897eee2265b863def18df1e59"} Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.101364 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46fb46c8ccb78b5be7198e8eb3a23eeb572c904897eee2265b863def18df1e59" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.101426 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-gdkvb" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.244866 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9ng7k"] Feb 01 09:15:47 crc kubenswrapper[5127]: E0201 09:15:47.245616 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c9ea12-9b4a-4c88-b0a1-810b48999166" containerName="collect-profiles" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.245641 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c9ea12-9b4a-4c88-b0a1-810b48999166" containerName="collect-profiles" Feb 01 09:15:47 crc kubenswrapper[5127]: E0201 09:15:47.245680 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9032a10b-4966-4895-a50f-e0d4682049e9" containerName="neutron-metadata-openstack-openstack-cell1" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.245692 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9032a10b-4966-4895-a50f-e0d4682049e9" containerName="neutron-metadata-openstack-openstack-cell1" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.246010 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="9032a10b-4966-4895-a50f-e0d4682049e9" containerName="neutron-metadata-openstack-openstack-cell1" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.246040 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c9ea12-9b4a-4c88-b0a1-810b48999166" containerName="collect-profiles" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.251223 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.255894 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.256076 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.256352 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.258685 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.265035 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.280681 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9ng7k"] Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.325684 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7bk\" (UniqueName: \"kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.325889 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: 
\"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.326002 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.326124 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.326157 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.326181 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.428846 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.429450 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.429484 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.429516 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7bk\" (UniqueName: \"kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 
09:15:47.429657 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.429786 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.437766 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.437788 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.439039 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.446954 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.448937 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.455188 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7bk\" (UniqueName: \"kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk\") pod \"libvirt-openstack-openstack-cell1-9ng7k\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:15:47 crc kubenswrapper[5127]: I0201 09:15:47.582809 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k"
Feb 01 09:15:48 crc kubenswrapper[5127]: I0201 09:15:48.218353 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9ng7k"]
Feb 01 09:15:49 crc kubenswrapper[5127]: I0201 09:15:49.125866 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" event={"ID":"0a0a2c40-4f67-4b10-b267-5981c37d8253","Type":"ContainerStarted","Data":"55db86cd33807b26927d5d820037ddaaa473dcddb3bdb2d2ed5b6a16f57e1324"}
Feb 01 09:15:49 crc kubenswrapper[5127]: I0201 09:15:49.127069 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" event={"ID":"0a0a2c40-4f67-4b10-b267-5981c37d8253","Type":"ContainerStarted","Data":"14ac3866c375117d3ca78c38370a508142694cfa9fdf9d6414073231e1be4591"}
Feb 01 09:15:49 crc kubenswrapper[5127]: I0201 09:15:49.128897 5127 scope.go:117] "RemoveContainer" containerID="d5b8075211897ba9c810a285fde9d758dcf29f5e819abd0685e33ddb0e4a6bd0"
Feb 01 09:15:49 crc kubenswrapper[5127]: I0201 09:15:49.154693 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" podStartSLOduration=1.7063169120000001 podStartE2EDuration="2.154673785s" podCreationTimestamp="2026-02-01 09:15:47 +0000 UTC" firstStartedPulling="2026-02-01 09:15:48.225795041 +0000 UTC m=+8898.711697404" lastFinishedPulling="2026-02-01 09:15:48.674151884 +0000 UTC m=+8899.160054277" observedRunningTime="2026-02-01 09:15:49.151678575 +0000 UTC m=+8899.637580978" watchObservedRunningTime="2026-02-01 09:15:49.154673785 +0000 UTC m=+8899.640576148"
Feb 01 09:15:49 crc kubenswrapper[5127]: I0201 09:15:49.236411 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:15:49 crc kubenswrapper[5127]: E0201 09:15:49.236851 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:16:03 crc kubenswrapper[5127]: I0201 09:16:03.236041 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:16:03 crc kubenswrapper[5127]: E0201 09:16:03.236821 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:16:14 crc kubenswrapper[5127]: I0201 09:16:14.235982 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
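The pod_startup_latency_tracker.go:104 entries carry the kubelet's startup SLI inline. For the libvirt pod above, podStartE2EDuration (2.154673785 s) minus podStartSLOduration (1.706316912 s) is about 0.448 s, which is exactly the firstStartedPulling-to-lastFinishedPulling window — consistent with the SLO figure discounting image-pull time. A sketch for extracting those numbers, with the field names copied from the lines above:

    import re

    LATENCY = re.compile(
        r'pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[\d.]+) '
        r'podStartE2EDuration="(?P<e2e>[\d.]+)s"'
    )

    def startup_latencies(entries):
        out = {}
        for e in entries:
            if (m := LATENCY.search(e)):
                slo, e2e = float(m['slo']), float(m['e2e'])
                out[m['pod']] = {'slo_s': slo, 'e2e_s': e2e, 'pull_s': round(e2e - slo, 3)}
        return out
    # libvirt pod above -> slo_s 1.706..., e2e_s 2.154..., pull_s 0.448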
Feb 01 09:16:14 crc kubenswrapper[5127]: E0201 09:16:14.237144 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:16:28 crc kubenswrapper[5127]: I0201 09:16:28.236375 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:16:28 crc kubenswrapper[5127]: E0201 09:16:28.237193 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:16:41 crc kubenswrapper[5127]: I0201 09:16:41.235651 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:16:41 crc kubenswrapper[5127]: E0201 09:16:41.236337 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:16:53 crc kubenswrapper[5127]: I0201 09:16:53.236443 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:16:53 crc kubenswrapper[5127]: E0201 09:16:53.237741 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:17:05 crc kubenswrapper[5127]: I0201 09:17:05.235704 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:17:05 crc kubenswrapper[5127]: E0201 09:17:05.241331 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:17:19 crc kubenswrapper[5127]: I0201 09:17:19.236987 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:17:19 crc kubenswrapper[5127]: E0201 09:17:19.238349 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:17:30 crc kubenswrapper[5127]: I0201 09:17:30.242407 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:17:30 crc kubenswrapper[5127]: E0201 09:17:30.243683 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:17:45 crc kubenswrapper[5127]: I0201 09:17:45.236285 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:17:45 crc kubenswrapper[5127]: E0201 09:17:45.237550 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:17:58 crc kubenswrapper[5127]: I0201 09:17:58.235395 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:17:58 crc kubenswrapper[5127]: E0201 09:17:58.237476 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:18:09 crc kubenswrapper[5127]: I0201 09:18:09.236335 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:18:09 crc kubenswrapper[5127]: E0201 09:18:09.237734 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:18:21 crc kubenswrapper[5127]: I0201 09:18:21.236022 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:18:21 crc kubenswrapper[5127]: E0201 09:18:21.237282 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:18:33 crc kubenswrapper[5127]: I0201 09:18:33.235713 5127 scope.go:117] "RemoveContainer" 
containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:18:33 crc kubenswrapper[5127]: E0201 09:18:33.236606 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:18:44 crc kubenswrapper[5127]: I0201 09:18:44.236192 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:18:44 crc kubenswrapper[5127]: E0201 09:18:44.237509 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:18:55 crc kubenswrapper[5127]: I0201 09:18:55.237077 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:18:55 crc kubenswrapper[5127]: E0201 09:18:55.241897 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:19:10 crc kubenswrapper[5127]: I0201 09:19:10.248285 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:19:10 crc kubenswrapper[5127]: E0201 09:19:10.249004 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.712829 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.715556 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.722235 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.807422 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.807544 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826p8\" (UniqueName: \"kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.807721 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.909052 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.909460 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.909558 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-826p8\" (UniqueName: \"kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.910097 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.910431 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:14 crc kubenswrapper[5127]: I0201 09:19:14.927637 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-826p8\" (UniqueName: \"kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8\") pod \"community-operators-g8l92\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:15 crc kubenswrapper[5127]: I0201 09:19:15.073841 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:15 crc kubenswrapper[5127]: I0201 09:19:15.611930 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:15 crc kubenswrapper[5127]: I0201 09:19:15.673188 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerStarted","Data":"95047d6cbe8c80bfbd88d8b66ff6f5cc5d3b8b2b37a979732582426359d5ff23"} Feb 01 09:19:16 crc kubenswrapper[5127]: I0201 09:19:16.690298 5127 generic.go:334] "Generic (PLEG): container finished" podID="e11228c9-a567-4607-815d-ecbbdbab687b" containerID="8b835f0e436d1f8a7aa7cb7853992106b922ef5f5a0dd051db1988d8cace3a4c" exitCode=0 Feb 01 09:19:16 crc kubenswrapper[5127]: I0201 09:19:16.690439 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerDied","Data":"8b835f0e436d1f8a7aa7cb7853992106b922ef5f5a0dd051db1988d8cace3a4c"} Feb 01 09:19:16 crc kubenswrapper[5127]: I0201 09:19:16.694162 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:19:17 crc kubenswrapper[5127]: I0201 09:19:17.705120 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerStarted","Data":"4feaf590308c9cd1a0f40f2a8f143e4cdf1fe3e69a5b48d534bb02f775fd85aa"} Feb 01 09:19:19 crc kubenswrapper[5127]: I0201 09:19:19.731360 5127 generic.go:334] "Generic (PLEG): container finished" podID="e11228c9-a567-4607-815d-ecbbdbab687b" containerID="4feaf590308c9cd1a0f40f2a8f143e4cdf1fe3e69a5b48d534bb02f775fd85aa" exitCode=0 Feb 01 09:19:19 crc kubenswrapper[5127]: I0201 09:19:19.731435 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerDied","Data":"4feaf590308c9cd1a0f40f2a8f143e4cdf1fe3e69a5b48d534bb02f775fd85aa"} Feb 01 09:19:20 crc kubenswrapper[5127]: I0201 09:19:20.743036 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerStarted","Data":"2a2296763f1f0fcc5d94642f1b92cd1674b835882cb60c488499fa93873c90f2"} Feb 01 09:19:22 crc kubenswrapper[5127]: I0201 09:19:22.236175 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:19:22 crc kubenswrapper[5127]: E0201 09:19:22.236728 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.073946 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.074359 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.135762 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.163056 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8l92" podStartSLOduration=7.659945924 podStartE2EDuration="11.163034129s" podCreationTimestamp="2026-02-01 09:19:14 +0000 UTC" firstStartedPulling="2026-02-01 09:19:16.693683601 +0000 UTC m=+9107.179585994" lastFinishedPulling="2026-02-01 09:19:20.196771826 +0000 UTC m=+9110.682674199" observedRunningTime="2026-02-01 09:19:20.764236897 +0000 UTC m=+9111.250139260" watchObservedRunningTime="2026-02-01 09:19:25.163034129 +0000 UTC m=+9115.648936502" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.871073 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:25 crc kubenswrapper[5127]: I0201 09:19:25.949298 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.796659 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.799686 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.822756 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.874820 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8l92" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="registry-server" containerID="cri-o://2a2296763f1f0fcc5d94642f1b92cd1674b835882cb60c488499fa93873c90f2" gracePeriod=2 Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.916107 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.916155 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:27 crc kubenswrapper[5127]: I0201 09:19:27.916544 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfzq\" (UniqueName: \"kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.018541 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfzq\" (UniqueName: \"kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.018753 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.018776 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.019314 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.019365 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.042081 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfzq\" (UniqueName: \"kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq\") pod \"redhat-marketplace-58vrq\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.128532 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.689240 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.900968 5127 generic.go:334] "Generic (PLEG): container finished" podID="e11228c9-a567-4607-815d-ecbbdbab687b" containerID="2a2296763f1f0fcc5d94642f1b92cd1674b835882cb60c488499fa93873c90f2" exitCode=0 Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.901106 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerDied","Data":"2a2296763f1f0fcc5d94642f1b92cd1674b835882cb60c488499fa93873c90f2"} Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.903890 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerStarted","Data":"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef"} Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.903937 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerStarted","Data":"c747a895377d40a9c8547dd677307a980b128c11a37243559c20e2914f018c10"} Feb 01 09:19:28 crc kubenswrapper[5127]: I0201 09:19:28.957365 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.038517 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities\") pod \"e11228c9-a567-4607-815d-ecbbdbab687b\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.038693 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-826p8\" (UniqueName: \"kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8\") pod \"e11228c9-a567-4607-815d-ecbbdbab687b\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.038915 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content\") pod \"e11228c9-a567-4607-815d-ecbbdbab687b\" (UID: \"e11228c9-a567-4607-815d-ecbbdbab687b\") " Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.039556 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities" (OuterVolumeSpecName: "utilities") pod "e11228c9-a567-4607-815d-ecbbdbab687b" (UID: "e11228c9-a567-4607-815d-ecbbdbab687b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.046871 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8" (OuterVolumeSpecName: "kube-api-access-826p8") pod "e11228c9-a567-4607-815d-ecbbdbab687b" (UID: "e11228c9-a567-4607-815d-ecbbdbab687b"). InnerVolumeSpecName "kube-api-access-826p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.093980 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e11228c9-a567-4607-815d-ecbbdbab687b" (UID: "e11228c9-a567-4607-815d-ecbbdbab687b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.141017 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-826p8\" (UniqueName: \"kubernetes.io/projected/e11228c9-a567-4607-815d-ecbbdbab687b-kube-api-access-826p8\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.141248 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.141324 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e11228c9-a567-4607-815d-ecbbdbab687b-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.918891 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8l92" event={"ID":"e11228c9-a567-4607-815d-ecbbdbab687b","Type":"ContainerDied","Data":"95047d6cbe8c80bfbd88d8b66ff6f5cc5d3b8b2b37a979732582426359d5ff23"} Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.919283 5127 scope.go:117] "RemoveContainer" containerID="2a2296763f1f0fcc5d94642f1b92cd1674b835882cb60c488499fa93873c90f2" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.918903 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g8l92" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.922014 5127 generic.go:334] "Generic (PLEG): container finished" podID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerID="f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef" exitCode=0 Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.922077 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerDied","Data":"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef"} Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.922117 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerStarted","Data":"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd"} Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.952654 5127 scope.go:117] "RemoveContainer" containerID="4feaf590308c9cd1a0f40f2a8f143e4cdf1fe3e69a5b48d534bb02f775fd85aa" Feb 01 09:19:29 crc kubenswrapper[5127]: I0201 09:19:29.992988 5127 scope.go:117] "RemoveContainer" containerID="8b835f0e436d1f8a7aa7cb7853992106b922ef5f5a0dd051db1988d8cace3a4c" Feb 01 09:19:30 crc kubenswrapper[5127]: I0201 09:19:29.998623 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:30 crc kubenswrapper[5127]: I0201 09:19:30.007428 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8l92"] Feb 01 09:19:30 crc kubenswrapper[5127]: I0201 09:19:30.250948 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" path="/var/lib/kubelet/pods/e11228c9-a567-4607-815d-ecbbdbab687b/volumes" Feb 01 09:19:30 crc kubenswrapper[5127]: I0201 09:19:30.943567 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerID="ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd" exitCode=0 Feb 01 09:19:30 crc kubenswrapper[5127]: I0201 09:19:30.943640 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerDied","Data":"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd"} Feb 01 09:19:31 crc kubenswrapper[5127]: I0201 09:19:31.959088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerStarted","Data":"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2"} Feb 01 09:19:31 crc kubenswrapper[5127]: I0201 09:19:31.982318 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-58vrq" podStartSLOduration=2.543402327 podStartE2EDuration="4.982290491s" podCreationTimestamp="2026-02-01 09:19:27 +0000 UTC" firstStartedPulling="2026-02-01 09:19:28.906272715 +0000 UTC m=+9119.392175098" lastFinishedPulling="2026-02-01 09:19:31.345160899 +0000 UTC m=+9121.831063262" observedRunningTime="2026-02-01 09:19:31.975769076 +0000 UTC m=+9122.461671439" watchObservedRunningTime="2026-02-01 09:19:31.982290491 +0000 UTC m=+9122.468192894" Feb 01 09:19:34 crc kubenswrapper[5127]: I0201 09:19:34.235995 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:19:34 crc kubenswrapper[5127]: E0201 09:19:34.236678 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:19:38 crc kubenswrapper[5127]: I0201 09:19:38.129387 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:38 crc kubenswrapper[5127]: I0201 09:19:38.130104 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:38 crc kubenswrapper[5127]: I0201 09:19:38.209737 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:39 crc kubenswrapper[5127]: I0201 09:19:39.117030 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:39 crc kubenswrapper[5127]: I0201 09:19:39.172899 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.067738 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-58vrq" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="registry-server" containerID="cri-o://5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2" gracePeriod=2 Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.614778 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.724299 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content\") pod \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.725200 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfzq\" (UniqueName: \"kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq\") pod \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.725619 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities\") pod \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\" (UID: \"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1\") " Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.726640 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities" (OuterVolumeSpecName: "utilities") pod "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" (UID: "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.726928 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.739687 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq" (OuterVolumeSpecName: "kube-api-access-gcfzq") pod "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" (UID: "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1"). InnerVolumeSpecName "kube-api-access-gcfzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.755361 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" (UID: "9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.828781 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:41 crc kubenswrapper[5127]: I0201 09:19:41.828831 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfzq\" (UniqueName: \"kubernetes.io/projected/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1-kube-api-access-gcfzq\") on node \"crc\" DevicePath \"\"" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.085426 5127 generic.go:334] "Generic (PLEG): container finished" podID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerID="5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2" exitCode=0 Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.085516 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerDied","Data":"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2"} Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.085575 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58vrq" event={"ID":"9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1","Type":"ContainerDied","Data":"c747a895377d40a9c8547dd677307a980b128c11a37243559c20e2914f018c10"} Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.085614 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58vrq" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.085642 5127 scope.go:117] "RemoveContainer" containerID="5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.148882 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.152053 5127 scope.go:117] "RemoveContainer" containerID="ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.160603 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-58vrq"] Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.197129 5127 scope.go:117] "RemoveContainer" containerID="f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.241467 5127 scope.go:117] "RemoveContainer" containerID="5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2" Feb 01 09:19:42 crc kubenswrapper[5127]: E0201 09:19:42.241870 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2\": container with ID starting with 5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2 not found: ID does not exist" containerID="5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.241897 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2"} err="failed to get container status 
\"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2\": rpc error: code = NotFound desc = could not find container \"5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2\": container with ID starting with 5c28fd7b9d99c65f4923987285ed39ad4ff62daf24af511e0919f514a43536d2 not found: ID does not exist" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.241915 5127 scope.go:117] "RemoveContainer" containerID="ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd" Feb 01 09:19:42 crc kubenswrapper[5127]: E0201 09:19:42.242527 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd\": container with ID starting with ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd not found: ID does not exist" containerID="ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.242550 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd"} err="failed to get container status \"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd\": rpc error: code = NotFound desc = could not find container \"ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd\": container with ID starting with ebd52e7265d5f5734da6d132ed1e9ef583e968280af93de8e11131c3113d9acd not found: ID does not exist" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.242564 5127 scope.go:117] "RemoveContainer" containerID="f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef" Feb 01 09:19:42 crc kubenswrapper[5127]: E0201 09:19:42.242808 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef\": container with ID starting with f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef not found: ID does not exist" containerID="f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.242829 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef"} err="failed to get container status \"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef\": rpc error: code = NotFound desc = could not find container \"f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef\": container with ID starting with f121d2f9b161408dc83816ebd6f4ab3c9138b6c28a105ed951e6226f5b8babef not found: ID does not exist" Feb 01 09:19:42 crc kubenswrapper[5127]: I0201 09:19:42.249823 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" path="/var/lib/kubelet/pods/9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1/volumes" Feb 01 09:19:48 crc kubenswrapper[5127]: I0201 09:19:48.236947 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:19:48 crc kubenswrapper[5127]: E0201 09:19:48.238210 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:20:00 crc kubenswrapper[5127]: I0201 09:20:00.250233 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:20:00 crc kubenswrapper[5127]: E0201 09:20:00.251718 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:20:12 crc kubenswrapper[5127]: I0201 09:20:12.236453 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:20:12 crc kubenswrapper[5127]: E0201 09:20:12.237655 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:20:24 crc kubenswrapper[5127]: I0201 09:20:24.237002 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:20:24 crc kubenswrapper[5127]: E0201 09:20:24.238292 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:20:37 crc kubenswrapper[5127]: I0201 09:20:37.235774 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b" Feb 01 09:20:37 crc kubenswrapper[5127]: I0201 09:20:37.778458 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae"} Feb 01 09:20:41 crc kubenswrapper[5127]: I0201 09:20:41.823982 5127 generic.go:334] "Generic (PLEG): container finished" podID="0a0a2c40-4f67-4b10-b267-5981c37d8253" containerID="55db86cd33807b26927d5d820037ddaaa473dcddb3bdb2d2ed5b6a16f57e1324" exitCode=0 Feb 01 09:20:41 crc kubenswrapper[5127]: I0201 09:20:41.824257 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" event={"ID":"0a0a2c40-4f67-4b10-b267-5981c37d8253","Type":"ContainerDied","Data":"55db86cd33807b26927d5d820037ddaaa473dcddb3bdb2d2ed5b6a16f57e1324"} Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.310475 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.400291 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.400434 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.400522 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.400847 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.401009 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.401461 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7bk\" (UniqueName: \"kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk\") pod \"0a0a2c40-4f67-4b10-b267-5981c37d8253\" (UID: \"0a0a2c40-4f67-4b10-b267-5981c37d8253\") " Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.407295 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph" (OuterVolumeSpecName: "ceph") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.408323 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk" (OuterVolumeSpecName: "kube-api-access-zv7bk") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "kube-api-access-zv7bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.408499 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.435873 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.438503 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory" (OuterVolumeSpecName: "inventory") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.444773 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0a0a2c40-4f67-4b10-b267-5981c37d8253" (UID: "0a0a2c40-4f67-4b10-b267-5981c37d8253"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504223 5127 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504274 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504287 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504300 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7bk\" (UniqueName: \"kubernetes.io/projected/0a0a2c40-4f67-4b10-b267-5981c37d8253-kube-api-access-zv7bk\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504309 5127 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.504319 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a0a2c40-4f67-4b10-b267-5981c37d8253-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.863323 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" event={"ID":"0a0a2c40-4f67-4b10-b267-5981c37d8253","Type":"ContainerDied","Data":"14ac3866c375117d3ca78c38370a508142694cfa9fdf9d6414073231e1be4591"} Feb 01 09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.863401 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14ac3866c375117d3ca78c38370a508142694cfa9fdf9d6414073231e1be4591" Feb 01 
09:20:43 crc kubenswrapper[5127]: I0201 09:20:43.863541 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9ng7k" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.063106 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6ct2s"] Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.063928 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.063948 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.063969 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.063976 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.064000 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="extract-content" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064007 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="extract-content" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.064015 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="extract-utilities" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064021 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="extract-utilities" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.064031 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="extract-utilities" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064037 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="extract-utilities" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.064048 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="extract-content" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064053 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="extract-content" Feb 01 09:20:44 crc kubenswrapper[5127]: E0201 09:20:44.064067 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0a2c40-4f67-4b10-b267-5981c37d8253" containerName="libvirt-openstack-openstack-cell1" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064072 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0a2c40-4f67-4b10-b267-5981c37d8253" containerName="libvirt-openstack-openstack-cell1" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064249 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e11228c9-a567-4607-815d-ecbbdbab687b" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064260 5127 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e7a3877-ecba-4496-8aa8-7b5db0c6c5e1" containerName="registry-server" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.064277 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0a2c40-4f67-4b10-b267-5981c37d8253" containerName="libvirt-openstack-openstack-cell1" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.065026 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.068129 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.068306 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.068700 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.068694 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.072348 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.072854 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.077951 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.099966 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6ct2s"] Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.221955 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4v8j\" (UniqueName: \"kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222051 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222104 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222136 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" 
(UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222159 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222203 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222274 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222357 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222430 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222494 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.222515 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326058 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph\") pod 
\"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326148 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326214 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326252 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326311 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326408 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326490 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326751 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: 
\"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326784 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.326876 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4v8j\" (UniqueName: \"kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.327209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.327209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.332594 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.332953 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.333138 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.333668 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.334188 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.334243 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.336720 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.342213 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.357423 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4v8j\" (UniqueName: \"kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j\") pod \"nova-cell1-openstack-openstack-cell1-6ct2s\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:44 crc kubenswrapper[5127]: I0201 09:20:44.403516 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:20:45 crc kubenswrapper[5127]: I0201 09:20:44.999364 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-6ct2s"]
Feb 01 09:20:45 crc kubenswrapper[5127]: I0201 09:20:45.884152 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" event={"ID":"30b9dbac-2336-4eab-8221-09ef2c34d3a7","Type":"ContainerStarted","Data":"fe490e32c9e8abbae3a9290c7518492f217b28d7bfee4d712de03535c36a65f9"}
Feb 01 09:20:45 crc kubenswrapper[5127]: I0201 09:20:45.884967 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" event={"ID":"30b9dbac-2336-4eab-8221-09ef2c34d3a7","Type":"ContainerStarted","Data":"ac98bd60d79733231ae139dc784c0df5cd391be2af1cdc9daf38c74f9ebeca3d"}
Feb 01 09:20:45 crc kubenswrapper[5127]: I0201 09:20:45.915213 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" podStartSLOduration=1.413281778 podStartE2EDuration="1.91519605s" podCreationTimestamp="2026-02-01 09:20:44 +0000 UTC" firstStartedPulling="2026-02-01 09:20:45.004489579 +0000 UTC m=+9195.490391982" lastFinishedPulling="2026-02-01 09:20:45.506403861 +0000 UTC m=+9195.992306254" observedRunningTime="2026-02-01 09:20:45.90699653 +0000 UTC m=+9196.392898903" watchObservedRunningTime="2026-02-01 09:20:45.91519605 +0000 UTC m=+9196.401098413"
Feb 01 09:23:06 crc kubenswrapper[5127]: I0201 09:23:06.741178 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 09:23:06 crc kubenswrapper[5127]: I0201 09:23:06.742010 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 09:23:36 crc kubenswrapper[5127]: I0201 09:23:36.741382 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 09:23:36 crc kubenswrapper[5127]: I0201 09:23:36.742228 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 09:23:59 crc kubenswrapper[5127]: I0201 09:23:59.463785 5127 generic.go:334] "Generic (PLEG): container finished" podID="30b9dbac-2336-4eab-8221-09ef2c34d3a7" containerID="fe490e32c9e8abbae3a9290c7518492f217b28d7bfee4d712de03535c36a65f9" exitCode=0
event={"ID":"30b9dbac-2336-4eab-8221-09ef2c34d3a7","Type":"ContainerDied","Data":"fe490e32c9e8abbae3a9290c7518492f217b28d7bfee4d712de03535c36a65f9"} Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.164856 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246044 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4v8j\" (UniqueName: \"kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246324 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246413 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246493 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246537 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246605 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246679 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246717 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246799 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle\") pod 
\"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246835 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.246875 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1\") pod \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\" (UID: \"30b9dbac-2336-4eab-8221-09ef2c34d3a7\") " Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.254286 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.255504 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph" (OuterVolumeSpecName: "ceph") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.255856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j" (OuterVolumeSpecName: "kube-api-access-r4v8j") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "kube-api-access-r4v8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.282516 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.284368 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.301733 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). 
InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.304677 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.309002 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.310592 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.311221 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.315024 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory" (OuterVolumeSpecName: "inventory") pod "30b9dbac-2336-4eab-8221-09ef2c34d3a7" (UID: "30b9dbac-2336-4eab-8221-09ef2c34d3a7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349576 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349648 5127 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349660 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349672 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349686 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349699 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349711 5127 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349723 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349735 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349746 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/30b9dbac-2336-4eab-8221-09ef2c34d3a7-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.349758 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4v8j\" (UniqueName: \"kubernetes.io/projected/30b9dbac-2336-4eab-8221-09ef2c34d3a7-kube-api-access-r4v8j\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.497675 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s" event={"ID":"30b9dbac-2336-4eab-8221-09ef2c34d3a7","Type":"ContainerDied","Data":"ac98bd60d79733231ae139dc784c0df5cd391be2af1cdc9daf38c74f9ebeca3d"} Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.498715 5127 
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.498715 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac98bd60d79733231ae139dc784c0df5cd391be2af1cdc9daf38c74f9ebeca3d"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.497777 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-6ct2s"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.660821 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-qdr6s"]
Feb 01 09:24:01 crc kubenswrapper[5127]: E0201 09:24:01.661709 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30b9dbac-2336-4eab-8221-09ef2c34d3a7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.661738 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b9dbac-2336-4eab-8221-09ef2c34d3a7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.662043 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="30b9dbac-2336-4eab-8221-09ef2c34d3a7" containerName="nova-cell1-openstack-openstack-cell1"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.663149 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.666591 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.667096 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.667512 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.667660 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.669013 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.695442 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-qdr6s"]
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759381 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s"
Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759481 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s"
\"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759613 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2z4p\" (UniqueName: \"kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759757 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759827 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759874 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.759960 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862152 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862276 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862339 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862533 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862619 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862703 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.862750 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2z4p\" (UniqueName: \"kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.868281 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.868862 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.870209 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0\") pod 
\"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.870989 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.871436 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.873476 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.873800 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.891093 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2z4p\" (UniqueName: \"kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p\") pod \"telemetry-openstack-openstack-cell1-qdr6s\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:01 crc kubenswrapper[5127]: I0201 09:24:01.996632 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:24:02 crc kubenswrapper[5127]: I0201 09:24:02.739249 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-qdr6s"] Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.034638 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"] Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.037722 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.037722 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.043641 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"]
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.104906 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.104962 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.105011 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk6x\" (UniqueName: \"kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.217810 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.217915 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.218026 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crk6x\" (UniqueName: \"kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.218984 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.219216 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9"
\"kube-api-access-crk6x\" (UniqueName: \"kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x\") pod \"redhat-operators-tx9d9\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.379145 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.536875 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" event={"ID":"31d637d0-729b-42fe-8cbc-1ed2c449c2b3","Type":"ContainerStarted","Data":"5b3e434d7a56c640f2be4108dea719d7111644485b0589747070bf883320bd3e"} Feb 01 09:24:03 crc kubenswrapper[5127]: I0201 09:24:03.900896 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"] Feb 01 09:24:03 crc kubenswrapper[5127]: W0201 09:24:03.910431 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod300ec1c3_2f4a_424b_9479_40fdd2f505bd.slice/crio-c7c24f561338a22e135d1fd4d220a78c4acca850b9f22cc2ce482ccaa69ad053 WatchSource:0}: Error finding container c7c24f561338a22e135d1fd4d220a78c4acca850b9f22cc2ce482ccaa69ad053: Status 404 returned error can't find the container with id c7c24f561338a22e135d1fd4d220a78c4acca850b9f22cc2ce482ccaa69ad053 Feb 01 09:24:04 crc kubenswrapper[5127]: I0201 09:24:04.546840 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" event={"ID":"31d637d0-729b-42fe-8cbc-1ed2c449c2b3","Type":"ContainerStarted","Data":"84f6e42236d1255385f5b790afdb50b40990e26295e5da618283977dfc99bf49"} Feb 01 09:24:04 crc kubenswrapper[5127]: I0201 09:24:04.548556 5127 generic.go:334] "Generic (PLEG): container finished" podID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerID="9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43" exitCode=0 Feb 01 09:24:04 crc kubenswrapper[5127]: I0201 09:24:04.548583 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerDied","Data":"9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43"} Feb 01 09:24:04 crc kubenswrapper[5127]: I0201 09:24:04.548611 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerStarted","Data":"c7c24f561338a22e135d1fd4d220a78c4acca850b9f22cc2ce482ccaa69ad053"} Feb 01 09:24:04 crc kubenswrapper[5127]: I0201 09:24:04.572786 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" podStartSLOduration=3.099345875 podStartE2EDuration="3.57277148s" podCreationTimestamp="2026-02-01 09:24:01 +0000 UTC" firstStartedPulling="2026-02-01 09:24:02.72376991 +0000 UTC m=+9393.209672283" lastFinishedPulling="2026-02-01 09:24:03.197195525 +0000 UTC m=+9393.683097888" observedRunningTime="2026-02-01 09:24:04.56945258 +0000 UTC m=+9395.055354943" watchObservedRunningTime="2026-02-01 09:24:04.57277148 +0000 UTC m=+9395.058673853" Feb 01 09:24:05 crc kubenswrapper[5127]: I0201 09:24:05.560191 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" 
event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerStarted","Data":"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724"} Feb 01 09:24:06 crc kubenswrapper[5127]: I0201 09:24:06.741204 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:24:06 crc kubenswrapper[5127]: I0201 09:24:06.741518 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:24:06 crc kubenswrapper[5127]: I0201 09:24:06.741574 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:24:06 crc kubenswrapper[5127]: I0201 09:24:06.742657 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:24:06 crc kubenswrapper[5127]: I0201 09:24:06.742720 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae" gracePeriod=600 Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.557362 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"] Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.559895 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.559895 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.573359 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"]
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.597509 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae" exitCode=0
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.597557 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae"}
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.597604 5127 scope.go:117] "RemoveContainer" containerID="5730a913a957f376f979fa38e7ec7f982e8fe65c7091ebaca46ec4f491d3332b"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.614997 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.615072 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.615159 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vkwp\" (UniqueName: \"kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.716974 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.717478 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.717577 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk"
"operationExecutor.MountVolume started for volume \"kube-api-access-8vkwp\" (UniqueName: \"kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.717805 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.747212 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vkwp\" (UniqueName: \"kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp\") pod \"certified-operators-q9fjk\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:07 crc kubenswrapper[5127]: I0201 09:24:07.885102 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:08 crc kubenswrapper[5127]: I0201 09:24:08.485134 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"] Feb 01 09:24:08 crc kubenswrapper[5127]: I0201 09:24:08.608527 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"} Feb 01 09:24:09 crc kubenswrapper[5127]: I0201 09:24:09.619408 5127 generic.go:334] "Generic (PLEG): container finished" podID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerID="26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6" exitCode=0 Feb 01 09:24:09 crc kubenswrapper[5127]: I0201 09:24:09.619472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerDied","Data":"26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6"} Feb 01 09:24:09 crc kubenswrapper[5127]: I0201 09:24:09.619928 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerStarted","Data":"8f522a9f1b5f53f1bfa44aabdb16dd5217acfaa8d503c8917dbeffe41c6359d9"} Feb 01 09:24:11 crc kubenswrapper[5127]: I0201 09:24:11.653144 5127 generic.go:334] "Generic (PLEG): container finished" podID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerID="17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724" exitCode=0 Feb 01 09:24:11 crc kubenswrapper[5127]: I0201 09:24:11.653331 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerDied","Data":"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724"} Feb 01 09:24:11 crc kubenswrapper[5127]: I0201 09:24:11.662172 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" 
event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerStarted","Data":"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58"} Feb 01 09:24:12 crc kubenswrapper[5127]: I0201 09:24:12.685375 5127 generic.go:334] "Generic (PLEG): container finished" podID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerID="b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58" exitCode=0 Feb 01 09:24:12 crc kubenswrapper[5127]: I0201 09:24:12.685472 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerDied","Data":"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58"} Feb 01 09:24:12 crc kubenswrapper[5127]: I0201 09:24:12.690419 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerStarted","Data":"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0"} Feb 01 09:24:12 crc kubenswrapper[5127]: I0201 09:24:12.726034 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tx9d9" podStartSLOduration=3.213920441 podStartE2EDuration="10.726013729s" podCreationTimestamp="2026-02-01 09:24:02 +0000 UTC" firstStartedPulling="2026-02-01 09:24:04.55002123 +0000 UTC m=+9395.035923593" lastFinishedPulling="2026-02-01 09:24:12.062114518 +0000 UTC m=+9402.548016881" observedRunningTime="2026-02-01 09:24:12.722247668 +0000 UTC m=+9403.208150071" watchObservedRunningTime="2026-02-01 09:24:12.726013729 +0000 UTC m=+9403.211916102" Feb 01 09:24:13 crc kubenswrapper[5127]: I0201 09:24:13.380071 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:13 crc kubenswrapper[5127]: I0201 09:24:13.380612 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:13 crc kubenswrapper[5127]: I0201 09:24:13.704146 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerStarted","Data":"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469"} Feb 01 09:24:13 crc kubenswrapper[5127]: I0201 09:24:13.727192 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q9fjk" podStartSLOduration=3.282025328 podStartE2EDuration="6.727176668s" podCreationTimestamp="2026-02-01 09:24:07 +0000 UTC" firstStartedPulling="2026-02-01 09:24:09.621715434 +0000 UTC m=+9400.107617797" lastFinishedPulling="2026-02-01 09:24:13.066866784 +0000 UTC m=+9403.552769137" observedRunningTime="2026-02-01 09:24:13.724366972 +0000 UTC m=+9404.210269345" watchObservedRunningTime="2026-02-01 09:24:13.727176668 +0000 UTC m=+9404.213079041" Feb 01 09:24:14 crc kubenswrapper[5127]: I0201 09:24:14.435828 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tx9d9" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="registry-server" probeResult="failure" output=< Feb 01 09:24:14 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:24:14 crc kubenswrapper[5127]: > Feb 01 09:24:17 crc kubenswrapper[5127]: I0201 09:24:17.885664 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:17 crc kubenswrapper[5127]: I0201 09:24:17.886343 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:17 crc kubenswrapper[5127]: I0201 09:24:17.938448 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:18 crc kubenswrapper[5127]: I0201 09:24:18.829978 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:21 crc kubenswrapper[5127]: I0201 09:24:21.551265 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"] Feb 01 09:24:21 crc kubenswrapper[5127]: I0201 09:24:21.551943 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q9fjk" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="registry-server" containerID="cri-o://7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469" gracePeriod=2 Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.715735 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.886257 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vkwp\" (UniqueName: \"kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp\") pod \"09e86629-5c38-4d04-84ce-284c8ff57e70\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.886416 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities\") pod \"09e86629-5c38-4d04-84ce-284c8ff57e70\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.886573 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content\") pod \"09e86629-5c38-4d04-84ce-284c8ff57e70\" (UID: \"09e86629-5c38-4d04-84ce-284c8ff57e70\") " Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.896312 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities" (OuterVolumeSpecName: "utilities") pod "09e86629-5c38-4d04-84ce-284c8ff57e70" (UID: "09e86629-5c38-4d04-84ce-284c8ff57e70"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.902532 5127 generic.go:334] "Generic (PLEG): container finished" podID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerID="7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469" exitCode=0 Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.902599 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerDied","Data":"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469"} Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.902642 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q9fjk" event={"ID":"09e86629-5c38-4d04-84ce-284c8ff57e70","Type":"ContainerDied","Data":"8f522a9f1b5f53f1bfa44aabdb16dd5217acfaa8d503c8917dbeffe41c6359d9"} Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.902667 5127 scope.go:117] "RemoveContainer" containerID="7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.903070 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q9fjk" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.905770 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp" (OuterVolumeSpecName: "kube-api-access-8vkwp") pod "09e86629-5c38-4d04-84ce-284c8ff57e70" (UID: "09e86629-5c38-4d04-84ce-284c8ff57e70"). InnerVolumeSpecName "kube-api-access-8vkwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.948693 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09e86629-5c38-4d04-84ce-284c8ff57e70" (UID: "09e86629-5c38-4d04-84ce-284c8ff57e70"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.967933 5127 scope.go:117] "RemoveContainer" containerID="b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.991841 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.991885 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vkwp\" (UniqueName: \"kubernetes.io/projected/09e86629-5c38-4d04-84ce-284c8ff57e70-kube-api-access-8vkwp\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:22 crc kubenswrapper[5127]: I0201 09:24:22.991901 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e86629-5c38-4d04-84ce-284c8ff57e70-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.003876 5127 scope.go:117] "RemoveContainer" containerID="26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.074874 5127 scope.go:117] "RemoveContainer" containerID="7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469" Feb 01 09:24:23 crc kubenswrapper[5127]: E0201 09:24:23.075336 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469\": container with ID starting with 7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469 not found: ID does not exist" containerID="7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.075373 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469"} err="failed to get container status \"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469\": rpc error: code = NotFound desc = could not find container \"7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469\": container with ID starting with 7fdd7f7b6d9141f394896fe1b843b9efc251b00f445e9eaba38e80b0fdd6b469 not found: ID does not exist" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.075398 5127 scope.go:117] "RemoveContainer" containerID="b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58" Feb 01 09:24:23 crc kubenswrapper[5127]: E0201 09:24:23.075666 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58\": container with ID starting with b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58 not found: ID does not exist" containerID="b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.075702 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58"} err="failed to get container status \"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58\": rpc error: code = NotFound desc = could not find container 
\"b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58\": container with ID starting with b11fe5dd5399d2e64679bafe31ecd9582921a212941992b87a9212f246f69c58 not found: ID does not exist" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.075723 5127 scope.go:117] "RemoveContainer" containerID="26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6" Feb 01 09:24:23 crc kubenswrapper[5127]: E0201 09:24:23.075978 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6\": container with ID starting with 26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6 not found: ID does not exist" containerID="26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.076010 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6"} err="failed to get container status \"26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6\": rpc error: code = NotFound desc = could not find container \"26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6\": container with ID starting with 26fa3de42d47fbcbde4b9d5c692a083179d5a8f39765df6d0c78bed9ce37d9e6 not found: ID does not exist" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.244980 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"] Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.253996 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q9fjk"] Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.463525 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:23 crc kubenswrapper[5127]: I0201 09:24:23.526438 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:24 crc kubenswrapper[5127]: I0201 09:24:24.260622 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" path="/var/lib/kubelet/pods/09e86629-5c38-4d04-84ce-284c8ff57e70/volumes" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.146098 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"] Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.147969 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tx9d9" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="registry-server" containerID="cri-o://68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0" gracePeriod=2 Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.639082 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.743926 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities\") pod \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.744038 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content\") pod \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.744258 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crk6x\" (UniqueName: \"kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x\") pod \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\" (UID: \"300ec1c3-2f4a-424b-9479-40fdd2f505bd\") " Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.744980 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities" (OuterVolumeSpecName: "utilities") pod "300ec1c3-2f4a-424b-9479-40fdd2f505bd" (UID: "300ec1c3-2f4a-424b-9479-40fdd2f505bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.749932 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x" (OuterVolumeSpecName: "kube-api-access-crk6x") pod "300ec1c3-2f4a-424b-9479-40fdd2f505bd" (UID: "300ec1c3-2f4a-424b-9479-40fdd2f505bd"). InnerVolumeSpecName "kube-api-access-crk6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.847283 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.847328 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crk6x\" (UniqueName: \"kubernetes.io/projected/300ec1c3-2f4a-424b-9479-40fdd2f505bd-kube-api-access-crk6x\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.879321 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "300ec1c3-2f4a-424b-9479-40fdd2f505bd" (UID: "300ec1c3-2f4a-424b-9479-40fdd2f505bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.949182 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/300ec1c3-2f4a-424b-9479-40fdd2f505bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.976918 5127 generic.go:334] "Generic (PLEG): container finished" podID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerID="68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0" exitCode=0 Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.976961 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerDied","Data":"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0"} Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.976990 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx9d9" event={"ID":"300ec1c3-2f4a-424b-9479-40fdd2f505bd","Type":"ContainerDied","Data":"c7c24f561338a22e135d1fd4d220a78c4acca850b9f22cc2ce482ccaa69ad053"} Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.977008 5127 scope.go:117] "RemoveContainer" containerID="68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0" Feb 01 09:24:28 crc kubenswrapper[5127]: I0201 09:24:28.977015 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx9d9" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.011505 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"] Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.024033 5127 scope.go:117] "RemoveContainer" containerID="17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.024541 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tx9d9"] Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.054819 5127 scope.go:117] "RemoveContainer" containerID="9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.104791 5127 scope.go:117] "RemoveContainer" containerID="68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0" Feb 01 09:24:29 crc kubenswrapper[5127]: E0201 09:24:29.105322 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0\": container with ID starting with 68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0 not found: ID does not exist" containerID="68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.105366 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0"} err="failed to get container status \"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0\": rpc error: code = NotFound desc = could not find container \"68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0\": container with ID starting with 68835f26cc8670c6dcb641ef259665b0a985d675e72e12bef68b6cdc0fb1dcc0 not found: ID does not exist" Feb 01 09:24:29 crc 
kubenswrapper[5127]: I0201 09:24:29.105391 5127 scope.go:117] "RemoveContainer" containerID="17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724" Feb 01 09:24:29 crc kubenswrapper[5127]: E0201 09:24:29.105708 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724\": container with ID starting with 17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724 not found: ID does not exist" containerID="17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.105738 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724"} err="failed to get container status \"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724\": rpc error: code = NotFound desc = could not find container \"17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724\": container with ID starting with 17913ca0dd96b225dd7c3beb1c575da67b3d95ba3e490bc9838b31d2f6d0d724 not found: ID does not exist" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.105753 5127 scope.go:117] "RemoveContainer" containerID="9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43" Feb 01 09:24:29 crc kubenswrapper[5127]: E0201 09:24:29.105937 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43\": container with ID starting with 9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43 not found: ID does not exist" containerID="9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43" Feb 01 09:24:29 crc kubenswrapper[5127]: I0201 09:24:29.105957 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43"} err="failed to get container status \"9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43\": rpc error: code = NotFound desc = could not find container \"9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43\": container with ID starting with 9417d55bb9d12824ea0e0d7914001aa03a32b9ccea009bc370878c0bdae18d43 not found: ID does not exist" Feb 01 09:24:30 crc kubenswrapper[5127]: I0201 09:24:30.251502 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" path="/var/lib/kubelet/pods/300ec1c3-2f4a-424b-9479-40fdd2f505bd/volumes" Feb 01 09:26:36 crc kubenswrapper[5127]: I0201 09:26:36.741070 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:26:36 crc kubenswrapper[5127]: I0201 09:26:36.742010 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:27:06 crc kubenswrapper[5127]: I0201 09:27:06.741318 5127 patch_prober.go:28] interesting 
pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:27:06 crc kubenswrapper[5127]: I0201 09:27:06.742132 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:27:32 crc kubenswrapper[5127]: I0201 09:27:32.021462 5127 trace.go:236] Trace[1680176171]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-1" (01-Feb-2026 09:27:30.998) (total time: 1022ms): Feb 01 09:27:32 crc kubenswrapper[5127]: Trace[1680176171]: [1.022571584s] [1.022571584s] END Feb 01 09:27:36 crc kubenswrapper[5127]: I0201 09:27:36.740880 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:27:36 crc kubenswrapper[5127]: I0201 09:27:36.743635 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:27:36 crc kubenswrapper[5127]: I0201 09:27:36.743907 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:27:36 crc kubenswrapper[5127]: I0201 09:27:36.745347 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:27:36 crc kubenswrapper[5127]: I0201 09:27:36.745892 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" gracePeriod=600 Feb 01 09:27:36 crc kubenswrapper[5127]: E0201 09:27:36.886946 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:27:37 crc kubenswrapper[5127]: I0201 09:27:37.399680 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" exitCode=0 Feb 01 09:27:37 crc kubenswrapper[5127]: 
I0201 09:27:37.399742 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"} Feb 01 09:27:37 crc kubenswrapper[5127]: I0201 09:27:37.399796 5127 scope.go:117] "RemoveContainer" containerID="5ec3c5052e71917ccc99909614655642e6621d22fd6ecb3067b15ba57f4e19ae" Feb 01 09:27:37 crc kubenswrapper[5127]: I0201 09:27:37.400771 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:27:37 crc kubenswrapper[5127]: E0201 09:27:37.401286 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:27:51 crc kubenswrapper[5127]: I0201 09:27:51.236624 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:27:51 crc kubenswrapper[5127]: E0201 09:27:51.237384 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:28:03 crc kubenswrapper[5127]: I0201 09:28:03.236021 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:28:03 crc kubenswrapper[5127]: E0201 09:28:03.236957 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:28:16 crc kubenswrapper[5127]: I0201 09:28:16.236255 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:28:16 crc kubenswrapper[5127]: E0201 09:28:16.237095 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:28:28 crc kubenswrapper[5127]: I0201 09:28:28.235533 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:28:28 crc kubenswrapper[5127]: E0201 09:28:28.236599 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:28:40 crc kubenswrapper[5127]: I0201 09:28:40.243250 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:28:40 crc kubenswrapper[5127]: E0201 09:28:40.244444 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:28:53 crc kubenswrapper[5127]: I0201 09:28:53.236853 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:28:53 crc kubenswrapper[5127]: E0201 09:28:53.238960 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:06 crc kubenswrapper[5127]: I0201 09:29:06.235934 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:29:06 crc kubenswrapper[5127]: E0201 09:29:06.237117 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:19 crc kubenswrapper[5127]: I0201 09:29:19.237207 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:29:19 crc kubenswrapper[5127]: E0201 09:29:19.238168 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:31 crc kubenswrapper[5127]: I0201 09:29:31.236311 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:29:31 crc kubenswrapper[5127]: E0201 09:29:31.237425 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.144969 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.145935 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.145951 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.145963 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="extract-content" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.145969 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="extract-content" Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.145982 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="extract-content" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.145990 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="extract-content" Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.146004 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="extract-utilities" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.146011 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="extract-utilities" Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.146027 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.146032 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: E0201 09:29:41.146047 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="extract-utilities" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.146053 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="extract-utilities" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.146232 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="300ec1c3-2f4a-424b-9479-40fdd2f505bd" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.146251 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e86629-5c38-4d04-84ce-284c8ff57e70" containerName="registry-server" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.147725 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.167963 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.168157 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67b4\" (UniqueName: \"kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.168358 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.168975 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.271055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67b4\" (UniqueName: \"kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.271188 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.276922 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.277932 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.278047 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.295817 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x67b4\" (UniqueName: \"kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4\") pod \"redhat-marketplace-bb5x5\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.341042 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.343107 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.361401 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.388005 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.388135 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.388202 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8hd\" (UniqueName: \"kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.490712 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.490861 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8hd\" (UniqueName: \"kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.490934 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.491710 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content\") pod \"community-operators-td7ss\" (UID: 
\"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.491848 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.496200 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.535820 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8hd\" (UniqueName: \"kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd\") pod \"community-operators-td7ss\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:41 crc kubenswrapper[5127]: I0201 09:29:41.665167 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:42 crc kubenswrapper[5127]: W0201 09:29:42.077050 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca437631_505b_404e_85d8_076575d15072.slice/crio-1ed23a45d82cbc9dc851dc31a899f2b33952cdaaca18ca546c7375cf64c5d24e WatchSource:0}: Error finding container 1ed23a45d82cbc9dc851dc31a899f2b33952cdaaca18ca546c7375cf64c5d24e: Status 404 returned error can't find the container with id 1ed23a45d82cbc9dc851dc31a899f2b33952cdaaca18ca546c7375cf64c5d24e Feb 01 09:29:42 crc kubenswrapper[5127]: I0201 09:29:42.083517 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:42 crc kubenswrapper[5127]: I0201 09:29:42.250167 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerStarted","Data":"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099"} Feb 01 09:29:42 crc kubenswrapper[5127]: I0201 09:29:42.250225 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerStarted","Data":"1ed23a45d82cbc9dc851dc31a899f2b33952cdaaca18ca546c7375cf64c5d24e"} Feb 01 09:29:42 crc kubenswrapper[5127]: I0201 09:29:42.417260 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:42 crc kubenswrapper[5127]: W0201 09:29:42.422313 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d0e67e_5faf_4054_937b_765faf01814c.slice/crio-89e557cf1d6d5275ed412b59ef8a4181258ea0cd69e790771a8673ca6059a002 WatchSource:0}: Error finding container 89e557cf1d6d5275ed412b59ef8a4181258ea0cd69e790771a8673ca6059a002: Status 404 returned error can't find the container with id 89e557cf1d6d5275ed412b59ef8a4181258ea0cd69e790771a8673ca6059a002 Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.261042 5127 generic.go:334] "Generic (PLEG): container finished" podID="ca437631-505b-404e-85d8-076575d15072" 
containerID="3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099" exitCode=0 Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.261159 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerDied","Data":"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099"} Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.263706 5127 generic.go:334] "Generic (PLEG): container finished" podID="b0d0e67e-5faf-4054-937b-765faf01814c" containerID="e80111d28a76d7019998d1b6bd6c169b7e283faf9ee0c3df18ea70a001a4caa9" exitCode=0 Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.263748 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerDied","Data":"e80111d28a76d7019998d1b6bd6c169b7e283faf9ee0c3df18ea70a001a4caa9"} Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.263778 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerStarted","Data":"89e557cf1d6d5275ed412b59ef8a4181258ea0cd69e790771a8673ca6059a002"} Feb 01 09:29:43 crc kubenswrapper[5127]: I0201 09:29:43.264713 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:29:44 crc kubenswrapper[5127]: I0201 09:29:44.277951 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerStarted","Data":"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071"} Feb 01 09:29:44 crc kubenswrapper[5127]: I0201 09:29:44.279950 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerStarted","Data":"b2eec23d93cccccfb952820d3a34f9f3f500b8553c017095c5b6fa247f03f65a"} Feb 01 09:29:45 crc kubenswrapper[5127]: I0201 09:29:45.245893 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:29:45 crc kubenswrapper[5127]: E0201 09:29:45.246743 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:45 crc kubenswrapper[5127]: I0201 09:29:45.297243 5127 generic.go:334] "Generic (PLEG): container finished" podID="ca437631-505b-404e-85d8-076575d15072" containerID="50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071" exitCode=0 Feb 01 09:29:45 crc kubenswrapper[5127]: I0201 09:29:45.297308 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerDied","Data":"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071"} Feb 01 09:29:45 crc kubenswrapper[5127]: I0201 09:29:45.301538 5127 generic.go:334] "Generic (PLEG): container finished" podID="b0d0e67e-5faf-4054-937b-765faf01814c" 
containerID="b2eec23d93cccccfb952820d3a34f9f3f500b8553c017095c5b6fa247f03f65a" exitCode=0 Feb 01 09:29:45 crc kubenswrapper[5127]: I0201 09:29:45.301611 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerDied","Data":"b2eec23d93cccccfb952820d3a34f9f3f500b8553c017095c5b6fa247f03f65a"} Feb 01 09:29:47 crc kubenswrapper[5127]: I0201 09:29:47.337872 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerStarted","Data":"07addec9838f16da8aa7e394816cda19c175a1b7f4f14b5e952799c2f5f66203"} Feb 01 09:29:47 crc kubenswrapper[5127]: I0201 09:29:47.342454 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerStarted","Data":"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164"} Feb 01 09:29:47 crc kubenswrapper[5127]: I0201 09:29:47.360214 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-td7ss" podStartSLOduration=3.868841921 podStartE2EDuration="6.360196013s" podCreationTimestamp="2026-02-01 09:29:41 +0000 UTC" firstStartedPulling="2026-02-01 09:29:43.266073934 +0000 UTC m=+9733.751976337" lastFinishedPulling="2026-02-01 09:29:45.757428036 +0000 UTC m=+9736.243330429" observedRunningTime="2026-02-01 09:29:47.357554422 +0000 UTC m=+9737.843456825" watchObservedRunningTime="2026-02-01 09:29:47.360196013 +0000 UTC m=+9737.846098376" Feb 01 09:29:47 crc kubenswrapper[5127]: I0201 09:29:47.388222 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bb5x5" podStartSLOduration=3.944276215 podStartE2EDuration="6.388204145s" podCreationTimestamp="2026-02-01 09:29:41 +0000 UTC" firstStartedPulling="2026-02-01 09:29:43.264381648 +0000 UTC m=+9733.750284021" lastFinishedPulling="2026-02-01 09:29:45.708309538 +0000 UTC m=+9736.194211951" observedRunningTime="2026-02-01 09:29:47.38350917 +0000 UTC m=+9737.869411533" watchObservedRunningTime="2026-02-01 09:29:47.388204145 +0000 UTC m=+9737.874106508" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.497366 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.498345 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.563658 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.677910 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.677990 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:51 crc kubenswrapper[5127]: I0201 09:29:51.755682 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:52 crc kubenswrapper[5127]: I0201 09:29:52.454795 5127 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:52 crc kubenswrapper[5127]: I0201 09:29:52.481944 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:53 crc kubenswrapper[5127]: I0201 09:29:53.352062 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:54 crc kubenswrapper[5127]: I0201 09:29:54.427486 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-td7ss" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="registry-server" containerID="cri-o://07addec9838f16da8aa7e394816cda19c175a1b7f4f14b5e952799c2f5f66203" gracePeriod=2 Feb 01 09:29:54 crc kubenswrapper[5127]: I0201 09:29:54.735839 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.439357 5127 generic.go:334] "Generic (PLEG): container finished" podID="b0d0e67e-5faf-4054-937b-765faf01814c" containerID="07addec9838f16da8aa7e394816cda19c175a1b7f4f14b5e952799c2f5f66203" exitCode=0 Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.439873 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bb5x5" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="registry-server" containerID="cri-o://8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164" gracePeriod=2 Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.439568 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerDied","Data":"07addec9838f16da8aa7e394816cda19c175a1b7f4f14b5e952799c2f5f66203"} Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.681689 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.816387 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities\") pod \"b0d0e67e-5faf-4054-937b-765faf01814c\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.816781 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8hd\" (UniqueName: \"kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd\") pod \"b0d0e67e-5faf-4054-937b-765faf01814c\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.816895 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content\") pod \"b0d0e67e-5faf-4054-937b-765faf01814c\" (UID: \"b0d0e67e-5faf-4054-937b-765faf01814c\") " Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.817442 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities" (OuterVolumeSpecName: "utilities") pod "b0d0e67e-5faf-4054-937b-765faf01814c" (UID: "b0d0e67e-5faf-4054-937b-765faf01814c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.826991 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd" (OuterVolumeSpecName: "kube-api-access-lp8hd") pod "b0d0e67e-5faf-4054-937b-765faf01814c" (UID: "b0d0e67e-5faf-4054-937b-765faf01814c"). InnerVolumeSpecName "kube-api-access-lp8hd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.887925 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0d0e67e-5faf-4054-937b-765faf01814c" (UID: "b0d0e67e-5faf-4054-937b-765faf01814c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.919696 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.919733 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp8hd\" (UniqueName: \"kubernetes.io/projected/b0d0e67e-5faf-4054-937b-765faf01814c-kube-api-access-lp8hd\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.919744 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0d0e67e-5faf-4054-937b-765faf01814c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:55 crc kubenswrapper[5127]: I0201 09:29:55.931171 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.020852 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content\") pod \"ca437631-505b-404e-85d8-076575d15072\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.020920 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x67b4\" (UniqueName: \"kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4\") pod \"ca437631-505b-404e-85d8-076575d15072\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.020995 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities\") pod \"ca437631-505b-404e-85d8-076575d15072\" (UID: \"ca437631-505b-404e-85d8-076575d15072\") " Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.022086 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities" (OuterVolumeSpecName: "utilities") pod "ca437631-505b-404e-85d8-076575d15072" (UID: "ca437631-505b-404e-85d8-076575d15072"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.027788 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4" (OuterVolumeSpecName: "kube-api-access-x67b4") pod "ca437631-505b-404e-85d8-076575d15072" (UID: "ca437631-505b-404e-85d8-076575d15072"). InnerVolumeSpecName "kube-api-access-x67b4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.055396 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca437631-505b-404e-85d8-076575d15072" (UID: "ca437631-505b-404e-85d8-076575d15072"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.123390 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.123432 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x67b4\" (UniqueName: \"kubernetes.io/projected/ca437631-505b-404e-85d8-076575d15072-kube-api-access-x67b4\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.123447 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca437631-505b-404e-85d8-076575d15072-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.451441 5127 generic.go:334] "Generic (PLEG): container finished" podID="ca437631-505b-404e-85d8-076575d15072" containerID="8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164" exitCode=0 Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.451498 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerDied","Data":"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164"} Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.451900 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bb5x5" event={"ID":"ca437631-505b-404e-85d8-076575d15072","Type":"ContainerDied","Data":"1ed23a45d82cbc9dc851dc31a899f2b33952cdaaca18ca546c7375cf64c5d24e"} Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.451549 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bb5x5" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.451936 5127 scope.go:117] "RemoveContainer" containerID="8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.455222 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-td7ss" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.455874 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-td7ss" event={"ID":"b0d0e67e-5faf-4054-937b-765faf01814c","Type":"ContainerDied","Data":"89e557cf1d6d5275ed412b59ef8a4181258ea0cd69e790771a8673ca6059a002"} Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.485987 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.488205 5127 scope.go:117] "RemoveContainer" containerID="50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.505374 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bb5x5"] Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.515203 5127 scope.go:117] "RemoveContainer" containerID="3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.518096 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.529845 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-td7ss"] Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.581573 5127 scope.go:117] "RemoveContainer" containerID="8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164" Feb 01 09:29:56 crc kubenswrapper[5127]: E0201 09:29:56.582220 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164\": container with ID starting with 8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164 not found: ID does not exist" containerID="8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.582312 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164"} err="failed to get container status \"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164\": rpc error: code = NotFound desc = could not find container \"8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164\": container with ID starting with 8f4bcea527fd03850c10d4a0c1a6f72e333815dcc102ac10cb1d6d3c3dbb2164 not found: ID does not exist" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.582367 5127 scope.go:117] "RemoveContainer" containerID="50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071" Feb 01 09:29:56 crc kubenswrapper[5127]: E0201 09:29:56.582931 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071\": container with ID starting with 50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071 not found: ID does not exist" containerID="50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.582963 5127 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071"} err="failed to get container status \"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071\": rpc error: code = NotFound desc = could not find container \"50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071\": container with ID starting with 50c75fbb8ce52a31d904140f37ada58210db3ae872ad0e22866f209cfb06c071 not found: ID does not exist" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.582987 5127 scope.go:117] "RemoveContainer" containerID="3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099" Feb 01 09:29:56 crc kubenswrapper[5127]: E0201 09:29:56.583484 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099\": container with ID starting with 3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099 not found: ID does not exist" containerID="3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.583552 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099"} err="failed to get container status \"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099\": rpc error: code = NotFound desc = could not find container \"3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099\": container with ID starting with 3047429bf1ca0aa3f744c54ed4a3dee9b88178f0ba70e8236c42d310691ef099 not found: ID does not exist" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.583617 5127 scope.go:117] "RemoveContainer" containerID="07addec9838f16da8aa7e394816cda19c175a1b7f4f14b5e952799c2f5f66203" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.623238 5127 scope.go:117] "RemoveContainer" containerID="b2eec23d93cccccfb952820d3a34f9f3f500b8553c017095c5b6fa247f03f65a" Feb 01 09:29:56 crc kubenswrapper[5127]: I0201 09:29:56.661722 5127 scope.go:117] "RemoveContainer" containerID="e80111d28a76d7019998d1b6bd6c169b7e283faf9ee0c3df18ea70a001a4caa9" Feb 01 09:29:57 crc kubenswrapper[5127]: I0201 09:29:57.236361 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:29:57 crc kubenswrapper[5127]: E0201 09:29:57.236907 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:29:58 crc kubenswrapper[5127]: I0201 09:29:58.251778 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" path="/var/lib/kubelet/pods/b0d0e67e-5faf-4054-937b-765faf01814c/volumes" Feb 01 09:29:58 crc kubenswrapper[5127]: I0201 09:29:58.253467 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca437631-505b-404e-85d8-076575d15072" path="/var/lib/kubelet/pods/ca437631-505b-404e-85d8-076575d15072/volumes" Feb 01 09:29:59 crc kubenswrapper[5127]: I0201 09:29:59.542423 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="31d637d0-729b-42fe-8cbc-1ed2c449c2b3" containerID="84f6e42236d1255385f5b790afdb50b40990e26295e5da618283977dfc99bf49" exitCode=0 Feb 01 09:29:59 crc kubenswrapper[5127]: I0201 09:29:59.542620 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" event={"ID":"31d637d0-729b-42fe-8cbc-1ed2c449c2b3","Type":"ContainerDied","Data":"84f6e42236d1255385f5b790afdb50b40990e26295e5da618283977dfc99bf49"} Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169024 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs"] Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169441 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="extract-content" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169458 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="extract-content" Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169490 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="extract-utilities" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169497 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="extract-utilities" Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169510 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169516 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169536 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="extract-content" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169542 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="extract-content" Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169554 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169559 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: E0201 09:30:00.169570 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="extract-utilities" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169588 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="extract-utilities" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169768 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d0e67e-5faf-4054-937b-765faf01814c" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.169791 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca437631-505b-404e-85d8-076575d15072" containerName="registry-server" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.171403 5127 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.177201 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.177303 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.197130 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs"] Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.322565 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj67j\" (UniqueName: \"kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.322677 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.322812 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.424293 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.424453 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj67j\" (UniqueName: \"kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.424521 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.425866 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume\") 
pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.433922 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.452074 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj67j\" (UniqueName: \"kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j\") pod \"collect-profiles-29498970-8gmxs\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:00 crc kubenswrapper[5127]: I0201 09:30:00.544547 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.095789 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs"] Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.162326 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.246979 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247049 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247090 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247132 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2z4p\" (UniqueName: \"kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247157 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247204 5127 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247277 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.247345 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1\") pod \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\" (UID: \"31d637d0-729b-42fe-8cbc-1ed2c449c2b3\") " Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.253476 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph" (OuterVolumeSpecName: "ceph") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.253505 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p" (OuterVolumeSpecName: "kube-api-access-z2z4p") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "kube-api-access-z2z4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.255502 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.279741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.283067 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.283299 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory" (OuterVolumeSpecName: "inventory") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.291356 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.291574 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "31d637d0-729b-42fe-8cbc-1ed2c449c2b3" (UID: "31d637d0-729b-42fe-8cbc-1ed2c449c2b3"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.349572 5127 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.349804 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.349866 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.349921 5127 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.350012 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.350079 5127 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.350133 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2z4p\" (UniqueName: \"kubernetes.io/projected/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-kube-api-access-z2z4p\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.350188 5127 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/31d637d0-729b-42fe-8cbc-1ed2c449c2b3-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.581088 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" event={"ID":"31d637d0-729b-42fe-8cbc-1ed2c449c2b3","Type":"ContainerDied","Data":"5b3e434d7a56c640f2be4108dea719d7111644485b0589747070bf883320bd3e"} Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.581130 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b3e434d7a56c640f2be4108dea719d7111644485b0589747070bf883320bd3e" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.581228 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-qdr6s" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.584296 5127 generic.go:334] "Generic (PLEG): container finished" podID="02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" containerID="c92eba0eda55bb189b4b464506e74717b39cf1c8baf0b57112dbc6446530f945" exitCode=0 Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.584334 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" event={"ID":"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d","Type":"ContainerDied","Data":"c92eba0eda55bb189b4b464506e74717b39cf1c8baf0b57112dbc6446530f945"} Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.584356 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" event={"ID":"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d","Type":"ContainerStarted","Data":"0da93f8982a40f35089d117fa5fda082cb9cbe6070fe30c1a60bc64edf3b4a2c"} Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.741148 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-74l6h"] Feb 01 09:30:01 crc kubenswrapper[5127]: E0201 09:30:01.741694 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d637d0-729b-42fe-8cbc-1ed2c449c2b3" containerName="telemetry-openstack-openstack-cell1" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.741713 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d637d0-729b-42fe-8cbc-1ed2c449c2b3" containerName="telemetry-openstack-openstack-cell1" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.741918 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d637d0-729b-42fe-8cbc-1ed2c449c2b3" containerName="telemetry-openstack-openstack-cell1" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.742680 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.745308 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.745353 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.745965 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.746008 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.746091 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.753414 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-74l6h"] Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.860204 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.860272 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.860294 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.860634 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.860930 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.861218 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-n4v42\" (UniqueName: \"kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.963641 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4v42\" (UniqueName: \"kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.963853 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.963974 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.964012 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.964133 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.964217 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.970462 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.970886 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.971107 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.971396 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.979531 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:01 crc kubenswrapper[5127]: I0201 09:30:01.989306 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4v42\" (UniqueName: \"kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42\") pod \"neutron-sriov-openstack-openstack-cell1-74l6h\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:02 crc kubenswrapper[5127]: I0201 09:30:02.057256 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:02 crc kubenswrapper[5127]: I0201 09:30:02.672718 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-74l6h"] Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.075163 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.194133 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume\") pod \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.194366 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume\") pod \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.194490 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj67j\" (UniqueName: \"kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j\") pod \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\" (UID: \"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d\") " Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.194943 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume" (OuterVolumeSpecName: "config-volume") pod "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" (UID: "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.195083 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.201065 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" (UID: "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.212391 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j" (OuterVolumeSpecName: "kube-api-access-gj67j") pod "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" (UID: "02d20aa4-8168-4dc3-8b16-a3d21ca6b45d"). InnerVolumeSpecName "kube-api-access-gj67j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.297052 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.297084 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj67j\" (UniqueName: \"kubernetes.io/projected/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d-kube-api-access-gj67j\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.609901 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" event={"ID":"25c41c63-61a4-4e80-b834-5f90523eb171","Type":"ContainerStarted","Data":"fa12bd57728f41fad81f130794b13777f9bed3ae4277c4347ad0e7a47c61cd8c"} Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.610362 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" event={"ID":"25c41c63-61a4-4e80-b834-5f90523eb171","Type":"ContainerStarted","Data":"c3af90f6fbfcc2a74c159b8337238a46f885de8f2fdd1691572baf6fe8410ef8"} Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.611811 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" event={"ID":"02d20aa4-8168-4dc3-8b16-a3d21ca6b45d","Type":"ContainerDied","Data":"0da93f8982a40f35089d117fa5fda082cb9cbe6070fe30c1a60bc64edf3b4a2c"} Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.611862 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0da93f8982a40f35089d117fa5fda082cb9cbe6070fe30c1a60bc64edf3b4a2c" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.612051 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs" Feb 01 09:30:03 crc kubenswrapper[5127]: I0201 09:30:03.647095 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" podStartSLOduration=2.076043738 podStartE2EDuration="2.647073825s" podCreationTimestamp="2026-02-01 09:30:01 +0000 UTC" firstStartedPulling="2026-02-01 09:30:02.687907514 +0000 UTC m=+9753.173809927" lastFinishedPulling="2026-02-01 09:30:03.258937651 +0000 UTC m=+9753.744840014" observedRunningTime="2026-02-01 09:30:03.633527471 +0000 UTC m=+9754.119429854" watchObservedRunningTime="2026-02-01 09:30:03.647073825 +0000 UTC m=+9754.132976198" Feb 01 09:30:04 crc kubenswrapper[5127]: I0201 09:30:04.179300 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l"] Feb 01 09:30:04 crc kubenswrapper[5127]: I0201 09:30:04.188700 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498925-dqn9l"] Feb 01 09:30:04 crc kubenswrapper[5127]: I0201 09:30:04.249005 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a02b0a-9955-45c6-af32-4b7aab3de4cf" path="/var/lib/kubelet/pods/82a02b0a-9955-45c6-af32-4b7aab3de4cf/volumes" Feb 01 09:30:10 crc kubenswrapper[5127]: I0201 09:30:10.244897 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:30:10 crc kubenswrapper[5127]: E0201 09:30:10.245709 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:30:25 crc kubenswrapper[5127]: I0201 09:30:25.236465 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:30:25 crc kubenswrapper[5127]: E0201 09:30:25.237419 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:30:39 crc kubenswrapper[5127]: I0201 09:30:39.236300 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:30:39 crc kubenswrapper[5127]: E0201 09:30:39.237350 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:30:49 crc kubenswrapper[5127]: I0201 09:30:49.390284 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="25c41c63-61a4-4e80-b834-5f90523eb171" containerID="fa12bd57728f41fad81f130794b13777f9bed3ae4277c4347ad0e7a47c61cd8c" exitCode=0 Feb 01 09:30:49 crc kubenswrapper[5127]: I0201 09:30:49.390399 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" event={"ID":"25c41c63-61a4-4e80-b834-5f90523eb171","Type":"ContainerDied","Data":"fa12bd57728f41fad81f130794b13777f9bed3ae4277c4347ad0e7a47c61cd8c"} Feb 01 09:30:49 crc kubenswrapper[5127]: I0201 09:30:49.647181 5127 scope.go:117] "RemoveContainer" containerID="ac7eb1a45ea242f6bb4b653b4e2f00135d4457b2c2ccfccd6661267abf08e1ef" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.041628 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.137468 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.137975 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.138128 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.138268 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4v42\" (UniqueName: \"kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.138950 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.139128 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph\") pod \"25c41c63-61a4-4e80-b834-5f90523eb171\" (UID: \"25c41c63-61a4-4e80-b834-5f90523eb171\") " Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.144483 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph" (OuterVolumeSpecName: "ceph") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.146281 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42" (OuterVolumeSpecName: "kube-api-access-n4v42") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "kube-api-access-n4v42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.146279 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.169435 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory" (OuterVolumeSpecName: "inventory") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.170431 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.178414 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "25c41c63-61a4-4e80-b834-5f90523eb171" (UID: "25c41c63-61a4-4e80-b834-5f90523eb171"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242290 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242326 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242338 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4v42\" (UniqueName: \"kubernetes.io/projected/25c41c63-61a4-4e80-b834-5f90523eb171-kube-api-access-n4v42\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242349 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242359 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.242370 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/25c41c63-61a4-4e80-b834-5f90523eb171-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.421837 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" event={"ID":"25c41c63-61a4-4e80-b834-5f90523eb171","Type":"ContainerDied","Data":"c3af90f6fbfcc2a74c159b8337238a46f885de8f2fdd1691572baf6fe8410ef8"} Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.421899 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3af90f6fbfcc2a74c159b8337238a46f885de8f2fdd1691572baf6fe8410ef8" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.422029 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-74l6h" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.570687 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2"] Feb 01 09:30:51 crc kubenswrapper[5127]: E0201 09:30:51.571348 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c41c63-61a4-4e80-b834-5f90523eb171" containerName="neutron-sriov-openstack-openstack-cell1" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.571379 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c41c63-61a4-4e80-b834-5f90523eb171" containerName="neutron-sriov-openstack-openstack-cell1" Feb 01 09:30:51 crc kubenswrapper[5127]: E0201 09:30:51.571423 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" containerName="collect-profiles" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.571436 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" containerName="collect-profiles" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.571801 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c41c63-61a4-4e80-b834-5f90523eb171" containerName="neutron-sriov-openstack-openstack-cell1" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.571882 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" containerName="collect-profiles" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.573394 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.576681 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.577137 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.577463 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.578449 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.587424 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.600293 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2"] Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.759457 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd7k\" (UniqueName: \"kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.760319 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" 
(UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.760401 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.760609 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.760940 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.761011 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.862814 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.863116 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.863238 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.863433 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" 
(UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.863539 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.863714 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd7k\" (UniqueName: \"kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.870170 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.870635 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.873641 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.882355 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.883011 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.888952 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd7k\" (UniqueName: \"kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k\") pod \"neutron-dhcp-openstack-openstack-cell1-rkfr2\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:30:51 crc 
Feb 01 09:30:51 crc kubenswrapper[5127]: I0201 09:30:51.897847 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2"
Feb 01 09:30:52 crc kubenswrapper[5127]: I0201 09:30:52.236624 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:30:52 crc kubenswrapper[5127]: E0201 09:30:52.237517 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:30:52 crc kubenswrapper[5127]: I0201 09:30:52.512259 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2"]
Feb 01 09:30:53 crc kubenswrapper[5127]: I0201 09:30:53.445976 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" event={"ID":"8a1da379-3eb9-4fa1-abe1-aabf0654833c","Type":"ContainerStarted","Data":"28212e5de69d15426f1bd12bad74c7fa377766766945ff1e12ce563d6d5e6723"}
Feb 01 09:30:53 crc kubenswrapper[5127]: I0201 09:30:53.446409 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" event={"ID":"8a1da379-3eb9-4fa1-abe1-aabf0654833c","Type":"ContainerStarted","Data":"5e143271483f1e0891903be93ade697b8d94fe3f65320b600fbbdc3ea937153a"}
Feb 01 09:30:53 crc kubenswrapper[5127]: I0201 09:30:53.476519 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" podStartSLOduration=1.963479395 podStartE2EDuration="2.476487323s" podCreationTimestamp="2026-02-01 09:30:51 +0000 UTC" firstStartedPulling="2026-02-01 09:30:52.513085328 +0000 UTC m=+9802.998987731" lastFinishedPulling="2026-02-01 09:30:53.026093296 +0000 UTC m=+9803.511995659" observedRunningTime="2026-02-01 09:30:53.465254441 +0000 UTC m=+9803.951156854" watchObservedRunningTime="2026-02-01 09:30:53.476487323 +0000 UTC m=+9803.962389726"
Feb 01 09:31:07 crc kubenswrapper[5127]: I0201 09:31:07.236774 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:31:07 crc kubenswrapper[5127]: E0201 09:31:07.237404 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:31:18 crc kubenswrapper[5127]: I0201 09:31:18.236478 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:31:18 crc kubenswrapper[5127]: E0201 09:31:18.237113 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:31:33 crc kubenswrapper[5127]: I0201 09:31:33.236338 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:31:33 crc kubenswrapper[5127]: E0201 09:31:33.237248 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:31:45 crc kubenswrapper[5127]: I0201 09:31:45.598982 5127 trace.go:236] Trace[319131081]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (01-Feb-2026 09:31:44.474) (total time: 1124ms):
Feb 01 09:31:45 crc kubenswrapper[5127]: Trace[319131081]: [1.124505142s] [1.124505142s] END
Feb 01 09:31:46 crc kubenswrapper[5127]: I0201 09:31:46.235470 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:31:46 crc kubenswrapper[5127]: E0201 09:31:46.236064 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:32:00 crc kubenswrapper[5127]: I0201 09:32:00.249883 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19"
Feb 01 09:32:00 crc kubenswrapper[5127]: E0201 09:32:00.250964 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:32:00 crc kubenswrapper[5127]: I0201 09:32:00.340356 5127 generic.go:334] "Generic (PLEG): container finished" podID="8a1da379-3eb9-4fa1-abe1-aabf0654833c" containerID="28212e5de69d15426f1bd12bad74c7fa377766766945ff1e12ce563d6d5e6723" exitCode=0
Feb 01 09:32:00 crc kubenswrapper[5127]: I0201 09:32:00.340444 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" event={"ID":"8a1da379-3eb9-4fa1-abe1-aabf0654833c","Type":"ContainerDied","Data":"28212e5de69d15426f1bd12bad74c7fa377766766945ff1e12ce563d6d5e6723"}
Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.934977 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2"
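[Annotation] machine-config-daemon-s2frk is in steady-state CrashLoopBackOff throughout this window: every retry logs "RemoveContainer" followed by "back-off 5m0s restarting failed container". Kubelet's restart back-off roughly doubles per failed restart from a 10s base up to a 5-minute cap, resetting only after the container has run cleanly for a while; the repeated 5m0s above means the cap has long been reached. A sketch of that growth pattern, assuming the upstream default base (10s) and cap (5m, matching the "back-off 5m0s" in the log):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet defaults: 10s initial back-off, doubled per
        // failed restart, capped at 5m.
        backoff, limit := 10*time.Second, 5*time.Minute
        for try := 1; try <= 8; try++ {
            fmt.Printf("restart %d: wait %v\n", try, backoff)
            backoff *= 2
            if backoff > limit {
                backoff = limit
            }
        }
    }

By the sixth or seventh failure the wait is pinned at 5m0s, which is why the retries above land roughly every 11 to 15 seconds of sync activity but the same 5m0s message repeats.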
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.985908 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.985993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.986176 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.986211 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvd7k\" (UniqueName: \"kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.986286 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.986325 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle\") pod \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\" (UID: \"8a1da379-3eb9-4fa1-abe1-aabf0654833c\") " Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.998804 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph" (OuterVolumeSpecName: "ceph") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:01 crc kubenswrapper[5127]: I0201 09:32:01.999723 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.012609 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k" (OuterVolumeSpecName: "kube-api-access-wvd7k") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). 
InnerVolumeSpecName "kube-api-access-wvd7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.033788 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.033968 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.040795 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory" (OuterVolumeSpecName: "inventory") pod "8a1da379-3eb9-4fa1-abe1-aabf0654833c" (UID: "8a1da379-3eb9-4fa1-abe1-aabf0654833c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.088931 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.088963 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.088976 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.088986 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvd7k\" (UniqueName: \"kubernetes.io/projected/8a1da379-3eb9-4fa1-abe1-aabf0654833c-kube-api-access-wvd7k\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.088996 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.089005 5127 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1da379-3eb9-4fa1-abe1-aabf0654833c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.365642 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" event={"ID":"8a1da379-3eb9-4fa1-abe1-aabf0654833c","Type":"ContainerDied","Data":"5e143271483f1e0891903be93ade697b8d94fe3f65320b600fbbdc3ea937153a"} Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.366092 5127 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e143271483f1e0891903be93ade697b8d94fe3f65320b600fbbdc3ea937153a" Feb 01 09:32:02 crc kubenswrapper[5127]: I0201 09:32:02.365719 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-rkfr2" Feb 01 09:32:12 crc kubenswrapper[5127]: I0201 09:32:12.235803 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:32:12 crc kubenswrapper[5127]: E0201 09:32:12.237090 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:32:17 crc kubenswrapper[5127]: I0201 09:32:17.590944 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:17 crc kubenswrapper[5127]: I0201 09:32:17.591800 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" gracePeriod=30 Feb 01 09:32:17 crc kubenswrapper[5127]: I0201 09:32:17.628423 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:17 crc kubenswrapper[5127]: I0201 09:32:17.629454 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.077346 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.082079 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.085003 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.085072 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-cell1-conductor-0" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerName="nova-cell1-conductor-conductor" Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.157177 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.160132 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.161472 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.161510 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.567799 5127 generic.go:334] "Generic (PLEG): container finished" podID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerID="d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" exitCode=0 Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.567847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f1d4e0b-4c49-4323-b3a7-48363d831f2b","Type":"ContainerDied","Data":"d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c"} Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.816525 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.816781 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-log" containerID="cri-o://03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.816855 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-api" containerID="cri-o://c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.833244 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.833495 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerName="nova-scheduler-scheduler" 
containerID="cri-o://799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.842820 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.843050 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" containerID="cri-o://a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: I0201 09:32:18.843146 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" containerID="cri-o://4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c" gracePeriod=30 Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.896761 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.907202 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.908397 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 01 09:32:18 crc kubenswrapper[5127]: E0201 09:32:18.908449 5127 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerName="nova-scheduler-scheduler" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.325860 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.419811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4q7v\" (UniqueName: \"kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v\") pod \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.420007 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle\") pod \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.420150 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data\") pod \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\" (UID: \"0f1d4e0b-4c49-4323-b3a7-48363d831f2b\") " Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.426393 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v" (OuterVolumeSpecName: "kube-api-access-j4q7v") pod "0f1d4e0b-4c49-4323-b3a7-48363d831f2b" (UID: "0f1d4e0b-4c49-4323-b3a7-48363d831f2b"). InnerVolumeSpecName "kube-api-access-j4q7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.449792 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data" (OuterVolumeSpecName: "config-data") pod "0f1d4e0b-4c49-4323-b3a7-48363d831f2b" (UID: "0f1d4e0b-4c49-4323-b3a7-48363d831f2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.451181 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f1d4e0b-4c49-4323-b3a7-48363d831f2b" (UID: "0f1d4e0b-4c49-4323-b3a7-48363d831f2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.523964 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.523998 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.524010 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4q7v\" (UniqueName: \"kubernetes.io/projected/0f1d4e0b-4c49-4323-b3a7-48363d831f2b-kube-api-access-j4q7v\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.579645 5127 generic.go:334] "Generic (PLEG): container finished" podID="5b561302-0463-490e-a011-e508d0f4e612" containerID="a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91" exitCode=143 Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.579693 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerDied","Data":"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91"} Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.582246 5127 generic.go:334] "Generic (PLEG): container finished" podID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerID="03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533" exitCode=143 Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.582325 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerDied","Data":"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533"} Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.584888 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0f1d4e0b-4c49-4323-b3a7-48363d831f2b","Type":"ContainerDied","Data":"b3ccb39f4383ab9f59bf89dcfa6fce6b12cf351501be5cbef267e9d9af71b089"} Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.584943 5127 scope.go:117] "RemoveContainer" containerID="d4c5876f23d1811fa41293106d83a42da0214d4851649d7a2c883ec5b9f0718c" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.584950 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.623418 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.641107 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.655429 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:19 crc kubenswrapper[5127]: E0201 09:32:19.656059 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a1da379-3eb9-4fa1-abe1-aabf0654833c" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.656080 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a1da379-3eb9-4fa1-abe1-aabf0654833c" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 01 09:32:19 crc kubenswrapper[5127]: E0201 09:32:19.656532 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerName="nova-cell1-conductor-conductor" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.656827 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerName="nova-cell1-conductor-conductor" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.658070 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a1da379-3eb9-4fa1-abe1-aabf0654833c" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.658135 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" containerName="nova-cell1-conductor-conductor" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.660224 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.674231 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.693031 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.727374 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.727431 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.727461 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mw5k\" (UniqueName: \"kubernetes.io/projected/f0f1c414-d0df-4128-9b09-b5a2028f3454-kube-api-access-5mw5k\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.829091 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.829146 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.829170 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mw5k\" (UniqueName: \"kubernetes.io/projected/f0f1c414-d0df-4128-9b09-b5a2028f3454-kube-api-access-5mw5k\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.833671 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.836114 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0f1c414-d0df-4128-9b09-b5a2028f3454-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:19 crc kubenswrapper[5127]: I0201 09:32:19.852755 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mw5k\" (UniqueName: \"kubernetes.io/projected/f0f1c414-d0df-4128-9b09-b5a2028f3454-kube-api-access-5mw5k\") pod \"nova-cell1-conductor-0\" (UID: \"f0f1c414-d0df-4128-9b09-b5a2028f3454\") " pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.001802 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.056376 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.141393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh85k\" (UniqueName: \"kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k\") pod \"ec2f810d-1f20-4378-ba82-cb5630da7544\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.141523 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data\") pod \"ec2f810d-1f20-4378-ba82-cb5630da7544\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.141659 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle\") pod \"ec2f810d-1f20-4378-ba82-cb5630da7544\" (UID: \"ec2f810d-1f20-4378-ba82-cb5630da7544\") " Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.146039 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k" (OuterVolumeSpecName: "kube-api-access-gh85k") pod "ec2f810d-1f20-4378-ba82-cb5630da7544" (UID: "ec2f810d-1f20-4378-ba82-cb5630da7544"). InnerVolumeSpecName "kube-api-access-gh85k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.181362 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec2f810d-1f20-4378-ba82-cb5630da7544" (UID: "ec2f810d-1f20-4378-ba82-cb5630da7544"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.195666 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data" (OuterVolumeSpecName: "config-data") pod "ec2f810d-1f20-4378-ba82-cb5630da7544" (UID: "ec2f810d-1f20-4378-ba82-cb5630da7544"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.243931 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.244177 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh85k\" (UniqueName: \"kubernetes.io/projected/ec2f810d-1f20-4378-ba82-cb5630da7544-kube-api-access-gh85k\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.244189 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec2f810d-1f20-4378-ba82-cb5630da7544-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.249862 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1d4e0b-4c49-4323-b3a7-48363d831f2b" path="/var/lib/kubelet/pods/0f1d4e0b-4c49-4323-b3a7-48363d831f2b/volumes" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.548486 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.608760 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f0f1c414-d0df-4128-9b09-b5a2028f3454","Type":"ContainerStarted","Data":"d21a23607fcd733afac6792887d0da76800d0ac911ce79454f586638ebde827b"} Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.630751 5127 generic.go:334] "Generic (PLEG): container finished" podID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" exitCode=0 Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.630825 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec2f810d-1f20-4378-ba82-cb5630da7544","Type":"ContainerDied","Data":"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c"} Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.630851 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ec2f810d-1f20-4378-ba82-cb5630da7544","Type":"ContainerDied","Data":"2170a4ab77296e373c7657d42102a4a36642790258eec50d5dda4b89a8c53ad7"} Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.630868 5127 scope.go:117] "RemoveContainer" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.630983 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.691252 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.709622 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.730688 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:20 crc kubenswrapper[5127]: E0201 09:32:20.731263 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerName="nova-scheduler-scheduler" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.731281 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerName="nova-scheduler-scheduler" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.731551 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" containerName="nova-scheduler-scheduler" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.732373 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.739356 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.746224 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.754769 5127 scope.go:117] "RemoveContainer" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.756215 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsn5k\" (UniqueName: \"kubernetes.io/projected/6063c9cb-f98d-44a8-863d-0ac61cd4257c-kube-api-access-nsn5k\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.756303 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-config-data\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.756328 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: E0201 09:32:20.767806 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c\": container with ID starting with 799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c not found: ID does not exist" containerID="799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.767847 5127 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c"} err="failed to get container status \"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c\": rpc error: code = NotFound desc = could not find container \"799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c\": container with ID starting with 799481e077fc33e966219b5e99d98bb437ea1d831e71d061a39e33fba19c4d0c not found: ID does not exist" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.858438 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsn5k\" (UniqueName: \"kubernetes.io/projected/6063c9cb-f98d-44a8-863d-0ac61cd4257c-kube-api-access-nsn5k\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.858831 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-config-data\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.858855 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.862683 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.875505 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsn5k\" (UniqueName: \"kubernetes.io/projected/6063c9cb-f98d-44a8-863d-0ac61cd4257c-kube-api-access-nsn5k\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:20 crc kubenswrapper[5127]: I0201 09:32:20.879008 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6063c9cb-f98d-44a8-863d-0ac61cd4257c-config-data\") pod \"nova-scheduler-0\" (UID: \"6063c9cb-f98d-44a8-863d-0ac61cd4257c\") " pod="openstack/nova-scheduler-0" Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.072244 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 01 09:32:21 crc kubenswrapper[5127]: W0201 09:32:21.618669 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6063c9cb_f98d_44a8_863d_0ac61cd4257c.slice/crio-badf880be3b9e51fee59f636e86b67d6d062af055bb828eafbd4290942a36733 WatchSource:0}: Error finding container badf880be3b9e51fee59f636e86b67d6d062af055bb828eafbd4290942a36733: Status 404 returned error can't find the container with id badf880be3b9e51fee59f636e86b67d6d062af055bb828eafbd4290942a36733 Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.622352 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.666113 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f0f1c414-d0df-4128-9b09-b5a2028f3454","Type":"ContainerStarted","Data":"9cc7b2a44deb50d67774f792860d6a74898e0e5ad4f643f25045efc5c706b3e0"} Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.666780 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.676499 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6063c9cb-f98d-44a8-863d-0ac61cd4257c","Type":"ContainerStarted","Data":"badf880be3b9e51fee59f636e86b67d6d062af055bb828eafbd4290942a36733"} Feb 01 09:32:21 crc kubenswrapper[5127]: I0201 09:32:21.704561 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.704543888 podStartE2EDuration="2.704543888s" podCreationTimestamp="2026-02-01 09:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:32:21.6930474 +0000 UTC m=+9892.178949773" watchObservedRunningTime="2026-02-01 09:32:21.704543888 +0000 UTC m=+9892.190446251" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.163100 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": dial tcp 10.217.1.93:8775: connect: connection refused" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.163134 5127 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.93:8775/\": dial tcp 10.217.1.93:8775: connect: connection refused" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.245719 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec2f810d-1f20-4378-ba82-cb5630da7544" path="/var/lib/kubelet/pods/ec2f810d-1f20-4378-ba82-cb5630da7544/volumes" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.566451 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.570827 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606168 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptjl9\" (UniqueName: \"kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9\") pod \"5b561302-0463-490e-a011-e508d0f4e612\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606270 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs\") pod \"5b561302-0463-490e-a011-e508d0f4e612\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606333 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs\") pod \"47d13d57-17e3-4d77-8cfe-30c383444cf7\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606351 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data\") pod \"5b561302-0463-490e-a011-e508d0f4e612\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606393 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data\") pod \"47d13d57-17e3-4d77-8cfe-30c383444cf7\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606459 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle\") pod \"47d13d57-17e3-4d77-8cfe-30c383444cf7\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606501 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd8mz\" (UniqueName: \"kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz\") pod \"47d13d57-17e3-4d77-8cfe-30c383444cf7\" (UID: \"47d13d57-17e3-4d77-8cfe-30c383444cf7\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle\") pod \"5b561302-0463-490e-a011-e508d0f4e612\" (UID: \"5b561302-0463-490e-a011-e508d0f4e612\") " Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606794 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs" (OuterVolumeSpecName: "logs") pod "5b561302-0463-490e-a011-e508d0f4e612" (UID: "5b561302-0463-490e-a011-e508d0f4e612"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.606985 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5b561302-0463-490e-a011-e508d0f4e612-logs\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.607691 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs" (OuterVolumeSpecName: "logs") pod "47d13d57-17e3-4d77-8cfe-30c383444cf7" (UID: "47d13d57-17e3-4d77-8cfe-30c383444cf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.617752 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9" (OuterVolumeSpecName: "kube-api-access-ptjl9") pod "5b561302-0463-490e-a011-e508d0f4e612" (UID: "5b561302-0463-490e-a011-e508d0f4e612"). InnerVolumeSpecName "kube-api-access-ptjl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.632622 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz" (OuterVolumeSpecName: "kube-api-access-jd8mz") pod "47d13d57-17e3-4d77-8cfe-30c383444cf7" (UID: "47d13d57-17e3-4d77-8cfe-30c383444cf7"). InnerVolumeSpecName "kube-api-access-jd8mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.658227 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b561302-0463-490e-a011-e508d0f4e612" (UID: "5b561302-0463-490e-a011-e508d0f4e612"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.663197 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47d13d57-17e3-4d77-8cfe-30c383444cf7" (UID: "47d13d57-17e3-4d77-8cfe-30c383444cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.700913 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6063c9cb-f98d-44a8-863d-0ac61cd4257c","Type":"ContainerStarted","Data":"c2b53fd8f981c9d71dfbfbb0ac19802bc3512892f1b5c32f0bfb40dcf30c2dee"} Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.701351 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data" (OuterVolumeSpecName: "config-data") pod "5b561302-0463-490e-a011-e508d0f4e612" (UID: "5b561302-0463-490e-a011-e508d0f4e612"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.701394 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data" (OuterVolumeSpecName: "config-data") pod "47d13d57-17e3-4d77-8cfe-30c383444cf7" (UID: "47d13d57-17e3-4d77-8cfe-30c383444cf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.704218 5127 generic.go:334] "Generic (PLEG): container finished" podID="5b561302-0463-490e-a011-e508d0f4e612" containerID="4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c" exitCode=0 Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.704282 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerDied","Data":"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c"} Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.704309 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5b561302-0463-490e-a011-e508d0f4e612","Type":"ContainerDied","Data":"a119009c5f8a8b3043bd7c09bed62cfda0b2f60bea00e62c344e10a9b968355a"} Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.704327 5127 scope.go:117] "RemoveContainer" containerID="4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.704418 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.709476 5127 generic.go:334] "Generic (PLEG): container finished" podID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerID="c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb" exitCode=0 Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.709673 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerDied","Data":"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb"} Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.709709 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"47d13d57-17e3-4d77-8cfe-30c383444cf7","Type":"ContainerDied","Data":"b800c4007e59115e6594a105c1f77b232d5e51975918eb599ec776f4b09879bb"} Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711333 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711355 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptjl9\" (UniqueName: \"kubernetes.io/projected/5b561302-0463-490e-a011-e508d0f4e612-kube-api-access-ptjl9\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711367 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b561302-0463-490e-a011-e508d0f4e612-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711376 5127 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/47d13d57-17e3-4d77-8cfe-30c383444cf7-logs\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711384 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711392 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47d13d57-17e3-4d77-8cfe-30c383444cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.711400 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd8mz\" (UniqueName: \"kubernetes.io/projected/47d13d57-17e3-4d77-8cfe-30c383444cf7-kube-api-access-jd8mz\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.714204 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.738633 5127 scope.go:117] "RemoveContainer" containerID="a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.754286 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.754267422 podStartE2EDuration="2.754267422s" podCreationTimestamp="2026-02-01 09:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:32:22.722619291 +0000 UTC m=+9893.208521644" watchObservedRunningTime="2026-02-01 09:32:22.754267422 +0000 UTC m=+9893.240169785" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.766060 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.808760 5127 scope.go:117] "RemoveContainer" containerID="4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.818283 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c\": container with ID starting with 4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c not found: ID does not exist" containerID="4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.818343 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c"} err="failed to get container status \"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c\": rpc error: code = NotFound desc = could not find container \"4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c\": container with ID starting with 4a7d4a554c67ce570bb867a4c658d96d0238b2b0cd0a9f23d395a8ffc569ae7c not found: ID does not exist" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.818395 5127 scope.go:117] "RemoveContainer" containerID="a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.818849 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91\": container with ID starting with a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91 not found: ID does not exist" containerID="a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.818894 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91"} err="failed to get container status \"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91\": rpc error: code = NotFound desc = could not find container \"a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91\": container with ID starting with a1feb8af59e6f6858ea4f066bdaccf0b90af4bd3b98ac269fd016b90aa0b1c91 not found: ID does not exist" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.818921 5127 scope.go:117] "RemoveContainer" containerID="c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.853687 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.856355 5127 scope.go:117] "RemoveContainer" containerID="03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.867363 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.867855 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.867876 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.867892 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-api" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.867899 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-api" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.867914 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.867920 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.867939 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-log" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.867944 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-log" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.868134 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-log" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.868150 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-log" Feb 01 
09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.868168 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" containerName="nova-api-api" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.868178 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b561302-0463-490e-a011-e508d0f4e612" containerName="nova-metadata-metadata" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.869542 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.872561 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.888373 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.896599 5127 scope.go:117] "RemoveContainer" containerID="c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.896974 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb\": container with ID starting with c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb not found: ID does not exist" containerID="c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.897057 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb"} err="failed to get container status \"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb\": rpc error: code = NotFound desc = could not find container \"c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb\": container with ID starting with c1a4ed0cdad3620eadd19737f5587789c85af29bddf3ad4d3d8e7089738638bb not found: ID does not exist" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.897141 5127 scope.go:117] "RemoveContainer" containerID="03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533" Feb 01 09:32:22 crc kubenswrapper[5127]: E0201 09:32:22.897373 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533\": container with ID starting with 03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533 not found: ID does not exist" containerID="03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.897443 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533"} err="failed to get container status \"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533\": rpc error: code = NotFound desc = could not find container \"03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533\": container with ID starting with 03c818b4522aea606c45aebad720d684135d9e26eb76bb8cf2ce59acde686533 not found: ID does not exist" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.897554 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: 
I0201 09:32:22.910635 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.917114 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-config-data\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.917418 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wd7f\" (UniqueName: \"kubernetes.io/projected/dbf60a1d-4462-418a-8ee6-23da577357fe-kube-api-access-2wd7f\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.917485 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.917801 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf60a1d-4462-418a-8ee6-23da577357fe-logs\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.931751 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.933522 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.936118 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 01 09:32:22 crc kubenswrapper[5127]: I0201 09:32:22.957259 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022594 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9zzs\" (UniqueName: \"kubernetes.io/projected/df602c4b-e8eb-4c9e-b855-6196b51eebe5-kube-api-access-s9zzs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022639 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022763 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-config-data\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022793 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wd7f\" (UniqueName: \"kubernetes.io/projected/dbf60a1d-4462-418a-8ee6-23da577357fe-kube-api-access-2wd7f\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022853 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-config-data\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.022893 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.023006 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf60a1d-4462-418a-8ee6-23da577357fe-logs\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.023032 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df602c4b-e8eb-4c9e-b855-6196b51eebe5-logs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.026957 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-config-data\") pod 
\"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.027217 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dbf60a1d-4462-418a-8ee6-23da577357fe-logs\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.029030 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbf60a1d-4462-418a-8ee6-23da577357fe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.041114 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wd7f\" (UniqueName: \"kubernetes.io/projected/dbf60a1d-4462-418a-8ee6-23da577357fe-kube-api-access-2wd7f\") pod \"nova-api-0\" (UID: \"dbf60a1d-4462-418a-8ee6-23da577357fe\") " pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.124912 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zzs\" (UniqueName: \"kubernetes.io/projected/df602c4b-e8eb-4c9e-b855-6196b51eebe5-kube-api-access-s9zzs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.124968 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.125081 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-config-data\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.125166 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df602c4b-e8eb-4c9e-b855-6196b51eebe5-logs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.125606 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df602c4b-e8eb-4c9e-b855-6196b51eebe5-logs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.132299 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-config-data\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.137984 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df602c4b-e8eb-4c9e-b855-6196b51eebe5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.140286 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zzs\" (UniqueName: \"kubernetes.io/projected/df602c4b-e8eb-4c9e-b855-6196b51eebe5-kube-api-access-s9zzs\") pod \"nova-metadata-0\" (UID: \"df602c4b-e8eb-4c9e-b855-6196b51eebe5\") " pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.155758 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 is running failed: container process not found" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.156525 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 is running failed: container process not found" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.156835 5127 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 is running failed: container process not found" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.156869 5127 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.188781 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.189839 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.256169 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.337297 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data\") pod \"e4ccca7b-584b-4aa9-badd-0438284cfa51\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.337764 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle\") pod \"e4ccca7b-584b-4aa9-badd-0438284cfa51\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.337982 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmssb\" (UniqueName: \"kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb\") pod \"e4ccca7b-584b-4aa9-badd-0438284cfa51\" (UID: \"e4ccca7b-584b-4aa9-badd-0438284cfa51\") " Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.343480 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb" (OuterVolumeSpecName: "kube-api-access-rmssb") pod "e4ccca7b-584b-4aa9-badd-0438284cfa51" (UID: "e4ccca7b-584b-4aa9-badd-0438284cfa51"). InnerVolumeSpecName "kube-api-access-rmssb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.368592 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data" (OuterVolumeSpecName: "config-data") pod "e4ccca7b-584b-4aa9-badd-0438284cfa51" (UID: "e4ccca7b-584b-4aa9-badd-0438284cfa51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.378223 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ccca7b-584b-4aa9-badd-0438284cfa51" (UID: "e4ccca7b-584b-4aa9-badd-0438284cfa51"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.440034 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmssb\" (UniqueName: \"kubernetes.io/projected/e4ccca7b-584b-4aa9-badd-0438284cfa51-kube-api-access-rmssb\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.440076 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.440086 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ccca7b-584b-4aa9-badd-0438284cfa51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.689097 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.722856 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dbf60a1d-4462-418a-8ee6-23da577357fe","Type":"ContainerStarted","Data":"883f2a9235779bcfc24d544305a69c8643d3e952fab252b4d806b98df8a3b576"} Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.724792 5127 generic.go:334] "Generic (PLEG): container finished" podID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" exitCode=0 Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.724862 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4ccca7b-584b-4aa9-badd-0438284cfa51","Type":"ContainerDied","Data":"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013"} Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.724895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e4ccca7b-584b-4aa9-badd-0438284cfa51","Type":"ContainerDied","Data":"cd61e0d69165ac8490d03a543348f99537a89d8ceda4077e136b2b87a21153e0"} Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.724916 5127 scope.go:117] "RemoveContainer" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.725033 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.774069 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: W0201 09:32:23.775730 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf602c4b_e8eb_4c9e_b855_6196b51eebe5.slice/crio-9ba6f6c4a314390259344dbde19752435c54bd5033f6954033bc42e3a28c13a8 WatchSource:0}: Error finding container 9ba6f6c4a314390259344dbde19752435c54bd5033f6954033bc42e3a28c13a8: Status 404 returned error can't find the container with id 9ba6f6c4a314390259344dbde19752435c54bd5033f6954033bc42e3a28c13a8 Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.878253 5127 scope.go:117] "RemoveContainer" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.878789 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013\": container with ID starting with f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 not found: ID does not exist" containerID="f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.878848 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013"} err="failed to get container status \"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013\": rpc error: code = NotFound desc = could not find container \"f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013\": container with ID starting with f567b055d82e4a308c558d69e63ac264d8d9b44358520a189d95b6c1b590a013 not found: ID does not exist" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.903213 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.912760 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.936914 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:23 crc kubenswrapper[5127]: E0201 09:32:23.937368 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.937384 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.937556 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" containerName="nova-cell0-conductor-conductor" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.938303 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.940759 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.948474 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.948705 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhtk9\" (UniqueName: \"kubernetes.io/projected/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-kube-api-access-nhtk9\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.948825 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:23 crc kubenswrapper[5127]: I0201 09:32:23.964423 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.051179 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.051280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhtk9\" (UniqueName: \"kubernetes.io/projected/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-kube-api-access-nhtk9\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.051329 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.056754 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.056982 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.081154 5127 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhtk9\" (UniqueName: \"kubernetes.io/projected/ab1f6e96-255f-4472-b0ac-1a712d4b40a2-kube-api-access-nhtk9\") pod \"nova-cell0-conductor-0\" (UID: \"ab1f6e96-255f-4472-b0ac-1a712d4b40a2\") " pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.245611 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47d13d57-17e3-4d77-8cfe-30c383444cf7" path="/var/lib/kubelet/pods/47d13d57-17e3-4d77-8cfe-30c383444cf7/volumes" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.246380 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b561302-0463-490e-a011-e508d0f4e612" path="/var/lib/kubelet/pods/5b561302-0463-490e-a011-e508d0f4e612/volumes" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.247224 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ccca7b-584b-4aa9-badd-0438284cfa51" path="/var/lib/kubelet/pods/e4ccca7b-584b-4aa9-badd-0438284cfa51/volumes" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.272659 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.748353 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.766895 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df602c4b-e8eb-4c9e-b855-6196b51eebe5","Type":"ContainerStarted","Data":"d20d80573079c222d2a60332639e5eef6588a4a44c01acad2dd54a96667cf10d"} Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.766938 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df602c4b-e8eb-4c9e-b855-6196b51eebe5","Type":"ContainerStarted","Data":"a58b52cac2020eee5dc46a93985a1cda5c17df0a512f568003840b475b24d7e1"} Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.766953 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"df602c4b-e8eb-4c9e-b855-6196b51eebe5","Type":"ContainerStarted","Data":"9ba6f6c4a314390259344dbde19752435c54bd5033f6954033bc42e3a28c13a8"} Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.777666 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dbf60a1d-4462-418a-8ee6-23da577357fe","Type":"ContainerStarted","Data":"b0dc977e47b57562fb82df47c2920c240a410afeeb17302e33777dd3ef2a6a81"} Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.777707 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dbf60a1d-4462-418a-8ee6-23da577357fe","Type":"ContainerStarted","Data":"e7a2a29d910443c8474630ee34ee49a13bab40625fb6ea50929c22666c266311"} Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.850415 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.850396669 podStartE2EDuration="2.850396669s" podCreationTimestamp="2026-02-01 09:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:32:24.845855317 +0000 UTC m=+9895.331757680" watchObservedRunningTime="2026-02-01 09:32:24.850396669 +0000 UTC m=+9895.336299032" Feb 01 09:32:24 crc kubenswrapper[5127]: I0201 09:32:24.852189 5127 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.852183438 podStartE2EDuration="2.852183438s" podCreationTimestamp="2026-02-01 09:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:32:24.80688734 +0000 UTC m=+9895.292789703" watchObservedRunningTime="2026-02-01 09:32:24.852183438 +0000 UTC m=+9895.338085801" Feb 01 09:32:25 crc kubenswrapper[5127]: I0201 09:32:25.039826 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 01 09:32:25 crc kubenswrapper[5127]: I0201 09:32:25.794543 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ab1f6e96-255f-4472-b0ac-1a712d4b40a2","Type":"ContainerStarted","Data":"49c1917a091bb363a085df23f83cbfb279dd59bb7a1771c5dcd1baeba60061f0"} Feb 01 09:32:25 crc kubenswrapper[5127]: I0201 09:32:25.794933 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ab1f6e96-255f-4472-b0ac-1a712d4b40a2","Type":"ContainerStarted","Data":"e57fae9d39c1f7c37f14033af0b3cf694f4e0b7ab90f9cff84e3926515bdda40"} Feb 01 09:32:25 crc kubenswrapper[5127]: I0201 09:32:25.795567 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:25 crc kubenswrapper[5127]: I0201 09:32:25.839049 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.839019182 podStartE2EDuration="2.839019182s" podCreationTimestamp="2026-02-01 09:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 09:32:25.818312805 +0000 UTC m=+9896.304215168" watchObservedRunningTime="2026-02-01 09:32:25.839019182 +0000 UTC m=+9896.324921585" Feb 01 09:32:26 crc kubenswrapper[5127]: I0201 09:32:26.073960 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 01 09:32:27 crc kubenswrapper[5127]: I0201 09:32:27.236030 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:32:27 crc kubenswrapper[5127]: E0201 09:32:27.236990 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:32:28 crc kubenswrapper[5127]: I0201 09:32:28.256282 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 09:32:28 crc kubenswrapper[5127]: I0201 09:32:28.256405 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 01 09:32:29 crc kubenswrapper[5127]: I0201 09:32:29.312720 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 01 09:32:31 crc kubenswrapper[5127]: I0201 09:32:31.073125 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 01 09:32:31 crc 
kubenswrapper[5127]: I0201 09:32:31.159555 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 01 09:32:31 crc kubenswrapper[5127]: I0201 09:32:31.920963 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 01 09:32:33 crc kubenswrapper[5127]: I0201 09:32:33.189095 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 09:32:33 crc kubenswrapper[5127]: I0201 09:32:33.189423 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 01 09:32:33 crc kubenswrapper[5127]: I0201 09:32:33.256778 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 09:32:33 crc kubenswrapper[5127]: I0201 09:32:33.256834 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 01 09:32:34 crc kubenswrapper[5127]: I0201 09:32:34.271838 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dbf60a1d-4462-418a-8ee6-23da577357fe" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 09:32:34 crc kubenswrapper[5127]: I0201 09:32:34.271916 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dbf60a1d-4462-418a-8ee6-23da577357fe" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 09:32:34 crc kubenswrapper[5127]: I0201 09:32:34.360828 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df602c4b-e8eb-4c9e-b855-6196b51eebe5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 09:32:34 crc kubenswrapper[5127]: I0201 09:32:34.360828 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="df602c4b-e8eb-4c9e-b855-6196b51eebe5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 01 09:32:41 crc kubenswrapper[5127]: I0201 09:32:41.237149 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:32:42 crc kubenswrapper[5127]: I0201 09:32:42.026549 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3"} Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.201780 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.202248 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.202764 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.202818 5127 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.207701 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.209012 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.265340 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.270829 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 09:32:43 crc kubenswrapper[5127]: I0201 09:32:43.271079 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 01 09:32:44 crc kubenswrapper[5127]: I0201 09:32:44.055500 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.600538 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz"] Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.602395 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.605134 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.605761 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.605938 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.606090 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.606119 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.606360 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qs48v" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.606481 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.629614 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz"] Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776072 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776122 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n42h9\" (UniqueName: \"kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776152 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776182 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776216 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776293 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776324 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776423 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776594 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776748 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.776839 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.878780 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.878845 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.878918 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.878985 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879035 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc 
kubenswrapper[5127]: I0201 09:32:45.879088 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879124 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n42h9\" (UniqueName: \"kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879161 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879199 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879238 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.879406 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.880717 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.881305 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.887437 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.897351 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.897628 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.897673 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.897834 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.898072 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.898082 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc 
kubenswrapper[5127]: I0201 09:32:45.898492 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.901429 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n42h9\" (UniqueName: \"kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:45 crc kubenswrapper[5127]: I0201 09:32:45.931437 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:32:46 crc kubenswrapper[5127]: I0201 09:32:46.552641 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz"] Feb 01 09:32:46 crc kubenswrapper[5127]: W0201 09:32:46.553547 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bea9abe_b0b7_41af_bac0_07d4f854c1d6.slice/crio-a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e WatchSource:0}: Error finding container a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e: Status 404 returned error can't find the container with id a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e Feb 01 09:32:47 crc kubenswrapper[5127]: I0201 09:32:47.105646 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" event={"ID":"7bea9abe-b0b7-41af-bac0-07d4f854c1d6","Type":"ContainerStarted","Data":"a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e"} Feb 01 09:32:48 crc kubenswrapper[5127]: I0201 09:32:48.118655 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" event={"ID":"7bea9abe-b0b7-41af-bac0-07d4f854c1d6","Type":"ContainerStarted","Data":"4b19fdc989201a8cb9e82f1ad82252e61a60031c61a47715f7777cefd2e67816"} Feb 01 09:32:48 crc kubenswrapper[5127]: I0201 09:32:48.147529 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" podStartSLOduration=2.70882115 podStartE2EDuration="3.147510752s" podCreationTimestamp="2026-02-01 09:32:45 +0000 UTC" firstStartedPulling="2026-02-01 09:32:46.557672152 +0000 UTC m=+9917.043574535" lastFinishedPulling="2026-02-01 09:32:46.996361734 +0000 UTC m=+9917.482264137" observedRunningTime="2026-02-01 09:32:48.137398951 +0000 UTC m=+9918.623301314" watchObservedRunningTime="2026-02-01 09:32:48.147510752 +0000 UTC m=+9918.633413115" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.506742 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.512017 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.522105 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.656650 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.656994 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7rvp\" (UniqueName: \"kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.657298 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.759497 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.759729 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.759931 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7rvp\" (UniqueName: \"kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.759979 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.760249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.792621 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p7rvp\" (UniqueName: \"kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp\") pod \"redhat-operators-nzchq\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:46 crc kubenswrapper[5127]: I0201 09:34:46.834853 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:47 crc kubenswrapper[5127]: I0201 09:34:47.340154 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:34:48 crc kubenswrapper[5127]: I0201 09:34:48.259043 5127 generic.go:334] "Generic (PLEG): container finished" podID="b1143f15-3549-4848-a746-498f3d1df6f6" containerID="23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072" exitCode=0 Feb 01 09:34:48 crc kubenswrapper[5127]: I0201 09:34:48.259208 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerDied","Data":"23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072"} Feb 01 09:34:48 crc kubenswrapper[5127]: I0201 09:34:48.259367 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerStarted","Data":"e667de891648dec31db61905d4831e97c1b4892defe9c2a0746e445765609399"} Feb 01 09:34:48 crc kubenswrapper[5127]: I0201 09:34:48.261729 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 09:34:49 crc kubenswrapper[5127]: I0201 09:34:49.275437 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerStarted","Data":"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21"} Feb 01 09:34:50 crc kubenswrapper[5127]: I0201 09:34:50.289774 5127 generic.go:334] "Generic (PLEG): container finished" podID="b1143f15-3549-4848-a746-498f3d1df6f6" containerID="674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21" exitCode=0 Feb 01 09:34:50 crc kubenswrapper[5127]: I0201 09:34:50.289882 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerDied","Data":"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21"} Feb 01 09:34:51 crc kubenswrapper[5127]: I0201 09:34:51.307408 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerStarted","Data":"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101"} Feb 01 09:34:51 crc kubenswrapper[5127]: I0201 09:34:51.345537 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nzchq" podStartSLOduration=2.861692257 podStartE2EDuration="5.345511657s" podCreationTimestamp="2026-02-01 09:34:46 +0000 UTC" firstStartedPulling="2026-02-01 09:34:48.261419434 +0000 UTC m=+10038.747321797" lastFinishedPulling="2026-02-01 09:34:50.745238824 +0000 UTC m=+10041.231141197" observedRunningTime="2026-02-01 09:34:51.337485341 +0000 UTC m=+10041.823387734" watchObservedRunningTime="2026-02-01 09:34:51.345511657 +0000 UTC m=+10041.831414060" Feb 01 09:34:56 crc 
kubenswrapper[5127]: I0201 09:34:56.835560 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:56 crc kubenswrapper[5127]: I0201 09:34:56.836295 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:34:57 crc kubenswrapper[5127]: I0201 09:34:57.920016 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nzchq" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="registry-server" probeResult="failure" output=< Feb 01 09:34:57 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:34:57 crc kubenswrapper[5127]: > Feb 01 09:35:06 crc kubenswrapper[5127]: I0201 09:35:06.741308 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:35:06 crc kubenswrapper[5127]: I0201 09:35:06.741946 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:35:06 crc kubenswrapper[5127]: I0201 09:35:06.904017 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:35:06 crc kubenswrapper[5127]: I0201 09:35:06.984245 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:35:07 crc kubenswrapper[5127]: I0201 09:35:07.142455 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:35:08 crc kubenswrapper[5127]: I0201 09:35:08.509315 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nzchq" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="registry-server" containerID="cri-o://0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101" gracePeriod=2 Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.137011 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.215484 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7rvp\" (UniqueName: \"kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp\") pod \"b1143f15-3549-4848-a746-498f3d1df6f6\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.215761 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities\") pod \"b1143f15-3549-4848-a746-498f3d1df6f6\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.215826 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content\") pod \"b1143f15-3549-4848-a746-498f3d1df6f6\" (UID: \"b1143f15-3549-4848-a746-498f3d1df6f6\") " Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.217557 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities" (OuterVolumeSpecName: "utilities") pod "b1143f15-3549-4848-a746-498f3d1df6f6" (UID: "b1143f15-3549-4848-a746-498f3d1df6f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.233392 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp" (OuterVolumeSpecName: "kube-api-access-p7rvp") pod "b1143f15-3549-4848-a746-498f3d1df6f6" (UID: "b1143f15-3549-4848-a746-498f3d1df6f6"). InnerVolumeSpecName "kube-api-access-p7rvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.322245 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.322297 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7rvp\" (UniqueName: \"kubernetes.io/projected/b1143f15-3549-4848-a746-498f3d1df6f6-kube-api-access-p7rvp\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.409383 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1143f15-3549-4848-a746-498f3d1df6f6" (UID: "b1143f15-3549-4848-a746-498f3d1df6f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.424700 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1143f15-3549-4848-a746-498f3d1df6f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.519151 5127 generic.go:334] "Generic (PLEG): container finished" podID="b1143f15-3549-4848-a746-498f3d1df6f6" containerID="0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101" exitCode=0 Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.519194 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerDied","Data":"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101"} Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.519225 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nzchq" event={"ID":"b1143f15-3549-4848-a746-498f3d1df6f6","Type":"ContainerDied","Data":"e667de891648dec31db61905d4831e97c1b4892defe9c2a0746e445765609399"} Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.519245 5127 scope.go:117] "RemoveContainer" containerID="0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.519403 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nzchq" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.546228 5127 scope.go:117] "RemoveContainer" containerID="674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.557542 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.569794 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nzchq"] Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.604382 5127 scope.go:117] "RemoveContainer" containerID="23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.625343 5127 scope.go:117] "RemoveContainer" containerID="0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101" Feb 01 09:35:09 crc kubenswrapper[5127]: E0201 09:35:09.625739 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101\": container with ID starting with 0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101 not found: ID does not exist" containerID="0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.625783 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101"} err="failed to get container status \"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101\": rpc error: code = NotFound desc = could not find container \"0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101\": container with ID starting with 0b76813da1092472f9c503fe759bd13cabcfa9d528c8669c97a20e005211f101 not found: ID does not exist" Feb 01 09:35:09 crc 
kubenswrapper[5127]: I0201 09:35:09.625811 5127 scope.go:117] "RemoveContainer" containerID="674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21" Feb 01 09:35:09 crc kubenswrapper[5127]: E0201 09:35:09.626149 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21\": container with ID starting with 674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21 not found: ID does not exist" containerID="674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.626178 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21"} err="failed to get container status \"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21\": rpc error: code = NotFound desc = could not find container \"674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21\": container with ID starting with 674b3c51d8f862f78b7b4ac4837c2b8c6c158f62e788d513fb567e97e37a4e21 not found: ID does not exist" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.626263 5127 scope.go:117] "RemoveContainer" containerID="23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072" Feb 01 09:35:09 crc kubenswrapper[5127]: E0201 09:35:09.626559 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072\": container with ID starting with 23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072 not found: ID does not exist" containerID="23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072" Feb 01 09:35:09 crc kubenswrapper[5127]: I0201 09:35:09.626596 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072"} err="failed to get container status \"23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072\": rpc error: code = NotFound desc = could not find container \"23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072\": container with ID starting with 23ce01bd687e533a67c67f3a44f290b99e3c2ec4dd0c98d80381329932480072 not found: ID does not exist" Feb 01 09:35:10 crc kubenswrapper[5127]: I0201 09:35:10.253752 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" path="/var/lib/kubelet/pods/b1143f15-3549-4848-a746-498f3d1df6f6/volumes" Feb 01 09:35:36 crc kubenswrapper[5127]: I0201 09:35:36.740734 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:35:36 crc kubenswrapper[5127]: I0201 09:35:36.741468 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:35:37 crc kubenswrapper[5127]: I0201 09:35:37.897278 5127 generic.go:334] "Generic (PLEG): 
container finished" podID="7bea9abe-b0b7-41af-bac0-07d4f854c1d6" containerID="4b19fdc989201a8cb9e82f1ad82252e61a60031c61a47715f7777cefd2e67816" exitCode=0 Feb 01 09:35:37 crc kubenswrapper[5127]: I0201 09:35:37.897352 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" event={"ID":"7bea9abe-b0b7-41af-bac0-07d4f854c1d6","Type":"ContainerDied","Data":"4b19fdc989201a8cb9e82f1ad82252e61a60031c61a47715f7777cefd2e67816"} Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.462614 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.581221 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.581268 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n42h9\" (UniqueName: \"kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.581318 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.581422 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.581458 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582362 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582446 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582471 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0\") pod 
\"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582500 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.582569 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1\") pod \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\" (UID: \"7bea9abe-b0b7-41af-bac0-07d4f854c1d6\") " Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.589224 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph" (OuterVolumeSpecName: "ceph") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.605844 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.611138 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9" (OuterVolumeSpecName: "kube-api-access-n42h9") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "kube-api-access-n42h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.612488 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.617914 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.621845 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.632725 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.634484 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.641957 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.645513 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.666297 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory" (OuterVolumeSpecName: "inventory") pod "7bea9abe-b0b7-41af-bac0-07d4f854c1d6" (UID: "7bea9abe-b0b7-41af-bac0-07d4f854c1d6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685019 5127 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685058 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685073 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685090 5127 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685145 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685159 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685174 5127 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ceph\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685187 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n42h9\" (UniqueName: \"kubernetes.io/projected/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-kube-api-access-n42h9\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685199 5127 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685259 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.685273 5127 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bea9abe-b0b7-41af-bac0-07d4f854c1d6-inventory\") on node \"crc\" DevicePath \"\"" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.939209 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" event={"ID":"7bea9abe-b0b7-41af-bac0-07d4f854c1d6","Type":"ContainerDied","Data":"a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e"} Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 
09:35:39.939274 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92e57c70bd02c571ef5cde4c1a571456b303c20d3979da37a5405a8a7cc909e" Feb 01 09:35:39 crc kubenswrapper[5127]: I0201 09:35:39.939306 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz" Feb 01 09:36:06 crc kubenswrapper[5127]: I0201 09:36:06.740943 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:36:06 crc kubenswrapper[5127]: I0201 09:36:06.741552 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:36:06 crc kubenswrapper[5127]: I0201 09:36:06.741638 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:36:06 crc kubenswrapper[5127]: I0201 09:36:06.742691 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:36:06 crc kubenswrapper[5127]: I0201 09:36:06.742792 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3" gracePeriod=600 Feb 01 09:36:07 crc kubenswrapper[5127]: I0201 09:36:07.292796 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3" exitCode=0 Feb 01 09:36:07 crc kubenswrapper[5127]: I0201 09:36:07.293482 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3"} Feb 01 09:36:07 crc kubenswrapper[5127]: I0201 09:36:07.293522 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff"} Feb 01 09:36:07 crc kubenswrapper[5127]: I0201 09:36:07.293549 5127 scope.go:117] "RemoveContainer" containerID="615b1f214728b43f15061607774532327c57fe2dabb399bf9bfe06dd775caa19" Feb 01 09:37:10 crc kubenswrapper[5127]: I0201 09:37:10.071683 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 09:37:10 crc kubenswrapper[5127]: I0201 09:37:10.072422 5127 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/mariadb-copy-data" podUID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" containerName="adoption" containerID="cri-o://4bfbc89aa53e8543a4b32fc0aff3189e74ae543445c14b568152d24bfe27a12c" gracePeriod=30 Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.502125 5127 generic.go:334] "Generic (PLEG): container finished" podID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" containerID="4bfbc89aa53e8543a4b32fc0aff3189e74ae543445c14b568152d24bfe27a12c" exitCode=137 Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.502195 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef","Type":"ContainerDied","Data":"4bfbc89aa53e8543a4b32fc0aff3189e74ae543445c14b568152d24bfe27a12c"} Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.668303 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.746506 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") pod \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.746624 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlhw7\" (UniqueName: \"kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7\") pod \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\" (UID: \"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef\") " Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.754417 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7" (OuterVolumeSpecName: "kube-api-access-rlhw7") pod "d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" (UID: "d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef"). InnerVolumeSpecName "kube-api-access-rlhw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.775872 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183" (OuterVolumeSpecName: "mariadb-data") pod "d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" (UID: "d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef"). InnerVolumeSpecName "pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.849472 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") on node \"crc\" " Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.849527 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlhw7\" (UniqueName: \"kubernetes.io/projected/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef-kube-api-access-rlhw7\") on node \"crc\" DevicePath \"\"" Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.912396 5127 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.913170 5127 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183") on node "crc" Feb 01 09:37:40 crc kubenswrapper[5127]: I0201 09:37:40.952142 5127 reconciler_common.go:293] "Volume detached for volume \"pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e78186d1-81bb-4cd1-92a8-c077fe1eb183\") on node \"crc\" DevicePath \"\"" Feb 01 09:37:41 crc kubenswrapper[5127]: I0201 09:37:41.539055 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef","Type":"ContainerDied","Data":"3fff16be94bbc51e5a8d1867978d66f0f2c1c2b9f06154bce866bdac8f663ea1"} Feb 01 09:37:41 crc kubenswrapper[5127]: I0201 09:37:41.539156 5127 scope.go:117] "RemoveContainer" containerID="4bfbc89aa53e8543a4b32fc0aff3189e74ae543445c14b568152d24bfe27a12c" Feb 01 09:37:41 crc kubenswrapper[5127]: I0201 09:37:41.539170 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 01 09:37:41 crc kubenswrapper[5127]: I0201 09:37:41.598601 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 09:37:41 crc kubenswrapper[5127]: I0201 09:37:41.614649 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 01 09:37:42 crc kubenswrapper[5127]: I0201 09:37:42.256538 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" path="/var/lib/kubelet/pods/d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef/volumes" Feb 01 09:37:42 crc kubenswrapper[5127]: I0201 09:37:42.303866 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 01 09:37:42 crc kubenswrapper[5127]: I0201 09:37:42.304198 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" containerName="adoption" containerID="cri-o://a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b" gracePeriod=30 Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.915646 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.973456 5127 generic.go:334] "Generic (PLEG): container finished" podID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" containerID="a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b" exitCode=137 Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.973494 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b","Type":"ContainerDied","Data":"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b"} Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.973561 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b","Type":"ContainerDied","Data":"eececdd3e88f9e5870fe476afa5022c3cf4a1cd1c33ce566c5ce427378bece76"} Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.973666 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 01 09:38:12 crc kubenswrapper[5127]: I0201 09:38:12.973744 5127 scope.go:117] "RemoveContainer" containerID="a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.001267 5127 scope.go:117] "RemoveContainer" containerID="a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b" Feb 01 09:38:13 crc kubenswrapper[5127]: E0201 09:38:13.002028 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b\": container with ID starting with a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b not found: ID does not exist" containerID="a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.002066 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b"} err="failed to get container status \"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b\": rpc error: code = NotFound desc = could not find container \"a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b\": container with ID starting with a798977b328eac8b48ac5e81148fdd3b4e340cacb72c214c35a8003dfb1a759b not found: ID does not exist" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.034390 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert\") pod \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.035726 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") pod \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.035917 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r674h\" (UniqueName: \"kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h\") pod \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\" (UID: \"56e811b5-c18f-4f40-8ef6-bd6a2f35a62b\") " Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.045827 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h" (OuterVolumeSpecName: "kube-api-access-r674h") pod "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" (UID: "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b"). InnerVolumeSpecName "kube-api-access-r674h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.049720 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" (UID: "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.062218 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3" (OuterVolumeSpecName: "ovn-data") pod "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" (UID: "56e811b5-c18f-4f40-8ef6-bd6a2f35a62b"). InnerVolumeSpecName "pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.138490 5127 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.138615 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") on node \"crc\" " Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.138655 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r674h\" (UniqueName: \"kubernetes.io/projected/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b-kube-api-access-r674h\") on node \"crc\" DevicePath \"\"" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.169290 5127 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.169664 5127 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3") on node "crc" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.240725 5127 reconciler_common.go:293] "Volume detached for volume \"pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-edeeadef-3c5d-4761-b083-9e2faf93cdd3\") on node \"crc\" DevicePath \"\"" Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.330306 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 01 09:38:13 crc kubenswrapper[5127]: I0201 09:38:13.349719 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 01 09:38:14 crc kubenswrapper[5127]: I0201 09:38:14.268929 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" path="/var/lib/kubelet/pods/56e811b5-c18f-4f40-8ef6-bd6a2f35a62b/volumes" Feb 01 09:38:36 crc kubenswrapper[5127]: I0201 09:38:36.740499 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:38:36 crc kubenswrapper[5127]: I0201 09:38:36.741266 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:39:06 crc kubenswrapper[5127]: I0201 09:39:06.740561 
Feb 01 09:39:06 crc kubenswrapper[5127]: I0201 09:39:06.741402 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.592295 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"]
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593344 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="extract-utilities"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593365 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="extract-utilities"
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593389 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="extract-content"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593397 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="extract-content"
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593425 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bea9abe-b0b7-41af-bac0-07d4f854c1d6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593437 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bea9abe-b0b7-41af-bac0-07d4f854c1d6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593458 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="registry-server"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593467 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="registry-server"
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593482 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593490 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: E0201 09:39:10.593507 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593515 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593755 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="d070ee8f-e64c-4b3b-ae49-8f0cac53a9ef" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593779 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bea9abe-b0b7-41af-bac0-07d4f854c1d6" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593803 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1143f15-3549-4848-a746-498f3d1df6f6" containerName="registry-server"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.593815 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e811b5-c18f-4f40-8ef6-bd6a2f35a62b" containerName="adoption"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.596236 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.621230 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"]
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.636151 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rhr\" (UniqueName: \"kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.636345 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.646532 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.749371 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.749615 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rhr\" (UniqueName: \"kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.749709 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.750228 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl"
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.750566 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.777052 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rhr\" (UniqueName: \"kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr\") pod \"certified-operators-dv9bl\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:10 crc kubenswrapper[5127]: I0201 09:39:10.933861 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:11 crc kubenswrapper[5127]: I0201 09:39:11.501854 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"] Feb 01 09:39:11 crc kubenswrapper[5127]: I0201 09:39:11.802674 5127 generic.go:334] "Generic (PLEG): container finished" podID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerID="806754732df57ac81b936d0791ae9071c8b1acce5dfda8feb65fb439ad82b093" exitCode=0 Feb 01 09:39:11 crc kubenswrapper[5127]: I0201 09:39:11.802847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerDied","Data":"806754732df57ac81b936d0791ae9071c8b1acce5dfda8feb65fb439ad82b093"} Feb 01 09:39:11 crc kubenswrapper[5127]: I0201 09:39:11.803001 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerStarted","Data":"f6b5ae25e9840d948d7af2049200b6c2a7bca49307d032a6e7fd1c1befc05b93"} Feb 01 09:39:12 crc kubenswrapper[5127]: I0201 09:39:12.812779 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerStarted","Data":"968a4797eb18ea289c8529172ae0509d24257de958f6bf35c339bc1490db7a5f"} Feb 01 09:39:13 crc kubenswrapper[5127]: I0201 09:39:13.828329 5127 generic.go:334] "Generic (PLEG): container finished" podID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerID="968a4797eb18ea289c8529172ae0509d24257de958f6bf35c339bc1490db7a5f" exitCode=0 Feb 01 09:39:13 crc kubenswrapper[5127]: I0201 09:39:13.828387 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerDied","Data":"968a4797eb18ea289c8529172ae0509d24257de958f6bf35c339bc1490db7a5f"} Feb 01 09:39:14 crc kubenswrapper[5127]: I0201 09:39:14.843623 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerStarted","Data":"e6f13ec71facd14c537ba9a44a7da9394e7c9ec20617e8672fe0344fb9320a7c"} Feb 01 09:39:14 crc 
Feb 01 09:39:20 crc kubenswrapper[5127]: I0201 09:39:20.934374 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:20 crc kubenswrapper[5127]: I0201 09:39:20.935055 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:21 crc kubenswrapper[5127]: I0201 09:39:21.024121 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:22 crc kubenswrapper[5127]: I0201 09:39:22.020867 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dv9bl"
Feb 01 09:39:22 crc kubenswrapper[5127]: I0201 09:39:22.096538 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"]
Feb 01 09:39:23 crc kubenswrapper[5127]: I0201 09:39:23.971324 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dv9bl" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="registry-server" containerID="cri-o://e6f13ec71facd14c537ba9a44a7da9394e7c9ec20617e8672fe0344fb9320a7c" gracePeriod=2
Feb 01 09:39:24 crc kubenswrapper[5127]: I0201 09:39:24.992260 5127 generic.go:334] "Generic (PLEG): container finished" podID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerID="e6f13ec71facd14c537ba9a44a7da9394e7c9ec20617e8672fe0344fb9320a7c" exitCode=0
Feb 01 09:39:24 crc kubenswrapper[5127]: I0201 09:39:24.992777 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerDied","Data":"e6f13ec71facd14c537ba9a44a7da9394e7c9ec20617e8672fe0344fb9320a7c"}
Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.180997 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9bl"
Need to start a new one" pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.293932 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2rhr\" (UniqueName: \"kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr\") pod \"95ce9244-f7ed-4182-b002-77c0ca7706fc\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.294092 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities\") pod \"95ce9244-f7ed-4182-b002-77c0ca7706fc\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.294193 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") pod \"95ce9244-f7ed-4182-b002-77c0ca7706fc\" (UID: \"95ce9244-f7ed-4182-b002-77c0ca7706fc\") " Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.295805 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities" (OuterVolumeSpecName: "utilities") pod "95ce9244-f7ed-4182-b002-77c0ca7706fc" (UID: "95ce9244-f7ed-4182-b002-77c0ca7706fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.303186 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr" (OuterVolumeSpecName: "kube-api-access-w2rhr") pod "95ce9244-f7ed-4182-b002-77c0ca7706fc" (UID: "95ce9244-f7ed-4182-b002-77c0ca7706fc"). InnerVolumeSpecName "kube-api-access-w2rhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.343503 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95ce9244-f7ed-4182-b002-77c0ca7706fc" (UID: "95ce9244-f7ed-4182-b002-77c0ca7706fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.397650 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2rhr\" (UniqueName: \"kubernetes.io/projected/95ce9244-f7ed-4182-b002-77c0ca7706fc-kube-api-access-w2rhr\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.397685 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:25 crc kubenswrapper[5127]: I0201 09:39:25.397697 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95ce9244-f7ed-4182-b002-77c0ca7706fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.012979 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dv9bl" event={"ID":"95ce9244-f7ed-4182-b002-77c0ca7706fc","Type":"ContainerDied","Data":"f6b5ae25e9840d948d7af2049200b6c2a7bca49307d032a6e7fd1c1befc05b93"} Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.013068 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dv9bl" Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.013301 5127 scope.go:117] "RemoveContainer" containerID="e6f13ec71facd14c537ba9a44a7da9394e7c9ec20617e8672fe0344fb9320a7c" Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.046772 5127 scope.go:117] "RemoveContainer" containerID="968a4797eb18ea289c8529172ae0509d24257de958f6bf35c339bc1490db7a5f" Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.085775 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"] Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.090134 5127 scope.go:117] "RemoveContainer" containerID="806754732df57ac81b936d0791ae9071c8b1acce5dfda8feb65fb439ad82b093" Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.098988 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dv9bl"] Feb 01 09:39:26 crc kubenswrapper[5127]: I0201 09:39:26.248836 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" path="/var/lib/kubelet/pods/95ce9244-f7ed-4182-b002-77c0ca7706fc/volumes" Feb 01 09:39:36 crc kubenswrapper[5127]: I0201 09:39:36.740928 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:39:36 crc kubenswrapper[5127]: I0201 09:39:36.741559 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:39:36 crc kubenswrapper[5127]: I0201 09:39:36.741652 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:39:36 crc kubenswrapper[5127]: I0201 09:39:36.742830 5127 
Feb 01 09:39:36 crc kubenswrapper[5127]: I0201 09:39:36.742957 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" gracePeriod=600
Feb 01 09:39:36 crc kubenswrapper[5127]: E0201 09:39:36.873645 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:39:37 crc kubenswrapper[5127]: I0201 09:39:37.165129 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" exitCode=0
Feb 01 09:39:37 crc kubenswrapper[5127]: I0201 09:39:37.165239 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff"}
Feb 01 09:39:37 crc kubenswrapper[5127]: I0201 09:39:37.165534 5127 scope.go:117] "RemoveContainer" containerID="5ec2879d242ff56bb5def4fe5636c076b2aa81cf6a4f192db07a8f0fede3b2d3"
Feb 01 09:39:37 crc kubenswrapper[5127]: I0201 09:39:37.166830 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff"
Feb 01 09:39:37 crc kubenswrapper[5127]: E0201 09:39:37.167486 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.633717 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f49rr"]
Feb 01 09:39:43 crc kubenswrapper[5127]: E0201 09:39:43.634901 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="extract-utilities"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.634922 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="extract-utilities"
Feb 01 09:39:43 crc kubenswrapper[5127]: E0201 09:39:43.634972 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="registry-server"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.634985 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="registry-server"
"Deleted CPUSet assignment" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="registry-server" Feb 01 09:39:43 crc kubenswrapper[5127]: E0201 09:39:43.635011 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="extract-content" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.635024 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="extract-content" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.635374 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ce9244-f7ed-4182-b002-77c0ca7706fc" containerName="registry-server" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.637931 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.654386 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f49rr"] Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.777394 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.777462 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.777828 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2w2\" (UniqueName: \"kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.880831 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.880913 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.881073 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2w2\" (UniqueName: \"kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:43 crc kubenswrapper[5127]: 
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.881435 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.881521 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.916462 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2w2\" (UniqueName: \"kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2\") pod \"community-operators-f49rr\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " pod="openshift-marketplace/community-operators-f49rr"
Feb 01 09:39:43 crc kubenswrapper[5127]: I0201 09:39:43.989840 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f49rr"
Feb 01 09:39:44 crc kubenswrapper[5127]: I0201 09:39:44.514655 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f49rr"]
Feb 01 09:39:45 crc kubenswrapper[5127]: I0201 09:39:45.275067 5127 generic.go:334] "Generic (PLEG): container finished" podID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerID="ed61627389df19a0aa6344f110d8bed9bb08f314a03971455dfc9f34d4603671" exitCode=0
Feb 01 09:39:45 crc kubenswrapper[5127]: I0201 09:39:45.275413 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerDied","Data":"ed61627389df19a0aa6344f110d8bed9bb08f314a03971455dfc9f34d4603671"}
Feb 01 09:39:45 crc kubenswrapper[5127]: I0201 09:39:45.275452 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerStarted","Data":"3c5fbaa5cc58939d36079517cb115d7e20f417cab04929909d0204d2f919e697"}
Feb 01 09:39:46 crc kubenswrapper[5127]: I0201 09:39:46.293317 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerStarted","Data":"ddf0f41da6911090227e408984f8ecc3f69552cb26b8ab424b9f9b8f4115fd23"}
Feb 01 09:39:47 crc kubenswrapper[5127]: I0201 09:39:47.308668 5127 generic.go:334] "Generic (PLEG): container finished" podID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerID="ddf0f41da6911090227e408984f8ecc3f69552cb26b8ab424b9f9b8f4115fd23" exitCode=0
Feb 01 09:39:47 crc kubenswrapper[5127]: I0201 09:39:47.308784 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerDied","Data":"ddf0f41da6911090227e408984f8ecc3f69552cb26b8ab424b9f9b8f4115fd23"}
Feb 01 09:39:48 crc kubenswrapper[5127]: I0201 09:39:48.320693 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerStarted","Data":"4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2"}
event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerStarted","Data":"4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2"} Feb 01 09:39:48 crc kubenswrapper[5127]: I0201 09:39:48.344217 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f49rr" podStartSLOduration=2.893573548 podStartE2EDuration="5.344195712s" podCreationTimestamp="2026-02-01 09:39:43 +0000 UTC" firstStartedPulling="2026-02-01 09:39:45.279223277 +0000 UTC m=+10335.765125670" lastFinishedPulling="2026-02-01 09:39:47.729845431 +0000 UTC m=+10338.215747834" observedRunningTime="2026-02-01 09:39:48.339383084 +0000 UTC m=+10338.825285457" watchObservedRunningTime="2026-02-01 09:39:48.344195712 +0000 UTC m=+10338.830098115" Feb 01 09:39:50 crc kubenswrapper[5127]: I0201 09:39:50.248504 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:39:50 crc kubenswrapper[5127]: E0201 09:39:50.249339 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:39:53 crc kubenswrapper[5127]: I0201 09:39:53.990143 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:53 crc kubenswrapper[5127]: I0201 09:39:53.990892 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:54 crc kubenswrapper[5127]: I0201 09:39:54.080528 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:54 crc kubenswrapper[5127]: I0201 09:39:54.484660 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:54 crc kubenswrapper[5127]: I0201 09:39:54.552239 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f49rr"] Feb 01 09:39:56 crc kubenswrapper[5127]: I0201 09:39:56.448201 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f49rr" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="registry-server" containerID="cri-o://4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2" gracePeriod=2 Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.470004 5127 generic.go:334] "Generic (PLEG): container finished" podID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerID="4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2" exitCode=0 Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.470129 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerDied","Data":"4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2"} Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.470417 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f49rr" 
event={"ID":"8e6ee78e-b568-4bd1-8db5-b3711365ddef","Type":"ContainerDied","Data":"3c5fbaa5cc58939d36079517cb115d7e20f417cab04929909d0204d2f919e697"} Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.470443 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c5fbaa5cc58939d36079517cb115d7e20f417cab04929909d0204d2f919e697" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.530366 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.641156 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities\") pod \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.641624 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content\") pod \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.641811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn2w2\" (UniqueName: \"kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2\") pod \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\" (UID: \"8e6ee78e-b568-4bd1-8db5-b3711365ddef\") " Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.642297 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities" (OuterVolumeSpecName: "utilities") pod "8e6ee78e-b568-4bd1-8db5-b3711365ddef" (UID: "8e6ee78e-b568-4bd1-8db5-b3711365ddef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.642804 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.649724 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2" (OuterVolumeSpecName: "kube-api-access-rn2w2") pod "8e6ee78e-b568-4bd1-8db5-b3711365ddef" (UID: "8e6ee78e-b568-4bd1-8db5-b3711365ddef"). InnerVolumeSpecName "kube-api-access-rn2w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.714063 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e6ee78e-b568-4bd1-8db5-b3711365ddef" (UID: "8e6ee78e-b568-4bd1-8db5-b3711365ddef"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.744104 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e6ee78e-b568-4bd1-8db5-b3711365ddef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:57 crc kubenswrapper[5127]: I0201 09:39:57.744130 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn2w2\" (UniqueName: \"kubernetes.io/projected/8e6ee78e-b568-4bd1-8db5-b3711365ddef-kube-api-access-rn2w2\") on node \"crc\" DevicePath \"\"" Feb 01 09:39:58 crc kubenswrapper[5127]: I0201 09:39:58.486460 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f49rr" Feb 01 09:39:58 crc kubenswrapper[5127]: I0201 09:39:58.526465 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f49rr"] Feb 01 09:39:58 crc kubenswrapper[5127]: I0201 09:39:58.547729 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f49rr"] Feb 01 09:40:00 crc kubenswrapper[5127]: I0201 09:40:00.267090 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" path="/var/lib/kubelet/pods/8e6ee78e-b568-4bd1-8db5-b3711365ddef/volumes" Feb 01 09:40:02 crc kubenswrapper[5127]: I0201 09:40:02.236626 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:40:02 crc kubenswrapper[5127]: E0201 09:40:02.239412 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:40:02 crc kubenswrapper[5127]: I0201 09:40:02.994968 5127 trace.go:236] Trace[1073625851]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-1" (01-Feb-2026 09:40:01.954) (total time: 1040ms): Feb 01 09:40:02 crc kubenswrapper[5127]: Trace[1073625851]: [1.040387968s] [1.040387968s] END Feb 01 09:40:17 crc kubenswrapper[5127]: I0201 09:40:17.236293 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:40:17 crc kubenswrapper[5127]: E0201 09:40:17.237307 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:40:31 crc kubenswrapper[5127]: I0201 09:40:31.242618 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:40:31 crc kubenswrapper[5127]: E0201 09:40:31.243833 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Feb 01 09:40:45 crc kubenswrapper[5127]: I0201 09:40:45.236488 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff"
Feb 01 09:40:45 crc kubenswrapper[5127]: E0201 09:40:45.237547 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.461523 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"]
Feb 01 09:40:47 crc kubenswrapper[5127]: E0201 09:40:47.462478 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="registry-server"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.462493 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="registry-server"
Feb 01 09:40:47 crc kubenswrapper[5127]: E0201 09:40:47.462512 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="extract-content"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.462518 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="extract-content"
Feb 01 09:40:47 crc kubenswrapper[5127]: E0201 09:40:47.462535 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="extract-utilities"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.462541 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="extract-utilities"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.462758 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6ee78e-b568-4bd1-8db5-b3711365ddef" containerName="registry-server"
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.464240 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nthtc"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.474023 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"] Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.652412 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5ncn\" (UniqueName: \"kubernetes.io/projected/c4aa5b08-6ada-451b-9697-cc808466bda2-kube-api-access-f5ncn\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.652495 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.652528 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.754439 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.754506 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.754645 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5ncn\" (UniqueName: \"kubernetes.io/projected/c4aa5b08-6ada-451b-9697-cc808466bda2-kube-api-access-f5ncn\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.755518 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.755743 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities\") pod \"redhat-marketplace-nthtc\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") " pod="openshift-marketplace/redhat-marketplace-nthtc" Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.774823 5127 operation_generator.go:637] "MountVolume.SetUp 
Feb 01 09:40:47 crc kubenswrapper[5127]: I0201 09:40:47.818997 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:40:48 crc kubenswrapper[5127]: I0201 09:40:48.287938 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"]
Feb 01 09:40:49 crc kubenswrapper[5127]: I0201 09:40:49.213420 5127 generic.go:334] "Generic (PLEG): container finished" podID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerID="2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4" exitCode=0
Feb 01 09:40:49 crc kubenswrapper[5127]: I0201 09:40:49.213842 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerDied","Data":"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4"}
Feb 01 09:40:49 crc kubenswrapper[5127]: I0201 09:40:49.213885 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerStarted","Data":"f79eba6606d5244cc4433ae8b8fd0366e1d3a8633cbc4a309524885ee6ed3b8a"}
Feb 01 09:40:49 crc kubenswrapper[5127]: I0201 09:40:49.218757 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 01 09:40:50 crc kubenswrapper[5127]: I0201 09:40:50.230372 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerStarted","Data":"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"}
Feb 01 09:40:51 crc kubenswrapper[5127]: I0201 09:40:51.245245 5127 generic.go:334] "Generic (PLEG): container finished" podID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerID="d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487" exitCode=0
Feb 01 09:40:51 crc kubenswrapper[5127]: I0201 09:40:51.245412 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerDied","Data":"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"}
Feb 01 09:40:52 crc kubenswrapper[5127]: I0201 09:40:52.262922 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerStarted","Data":"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"}
Feb 01 09:40:52 crc kubenswrapper[5127]: I0201 09:40:52.299780 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nthtc" podStartSLOduration=2.806196908 podStartE2EDuration="5.299746964s" podCreationTimestamp="2026-02-01 09:40:47 +0000 UTC" firstStartedPulling="2026-02-01 09:40:49.218217166 +0000 UTC m=+10399.704119569" lastFinishedPulling="2026-02-01 09:40:51.711767222 +0000 UTC m=+10402.197669625" observedRunningTime="2026-02-01 09:40:52.28469983 +0000 UTC m=+10402.770602243" watchObservedRunningTime="2026-02-01 09:40:52.299746964 +0000 UTC m=+10402.785649367"
m=+10402.785649367"
Feb 01 09:40:57 crc kubenswrapper[5127]: I0201 09:40:57.819464 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:40:57 crc kubenswrapper[5127]: I0201 09:40:57.820211 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:40:57 crc kubenswrapper[5127]: I0201 09:40:57.887701 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:40:58 crc kubenswrapper[5127]: I0201 09:40:58.241705 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff"
Feb 01 09:40:58 crc kubenswrapper[5127]: E0201 09:40:58.242455 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf"
Feb 01 09:40:58 crc kubenswrapper[5127]: I0201 09:40:58.389191 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:40:58 crc kubenswrapper[5127]: I0201 09:40:58.445087 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"]
Feb 01 09:41:00 crc kubenswrapper[5127]: I0201 09:41:00.347730 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nthtc" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="registry-server" containerID="cri-o://fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780" gracePeriod=2
Feb 01 09:41:00 crc kubenswrapper[5127]: I0201 09:41:00.976145 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.176762 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities\") pod \"c4aa5b08-6ada-451b-9697-cc808466bda2\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") "
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.176993 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5ncn\" (UniqueName: \"kubernetes.io/projected/c4aa5b08-6ada-451b-9697-cc808466bda2-kube-api-access-f5ncn\") pod \"c4aa5b08-6ada-451b-9697-cc808466bda2\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") "
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.177160 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content\") pod \"c4aa5b08-6ada-451b-9697-cc808466bda2\" (UID: \"c4aa5b08-6ada-451b-9697-cc808466bda2\") "
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.177891 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities" (OuterVolumeSpecName: "utilities") pod "c4aa5b08-6ada-451b-9697-cc808466bda2" (UID: "c4aa5b08-6ada-451b-9697-cc808466bda2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.191887 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4aa5b08-6ada-451b-9697-cc808466bda2-kube-api-access-f5ncn" (OuterVolumeSpecName: "kube-api-access-f5ncn") pod "c4aa5b08-6ada-451b-9697-cc808466bda2" (UID: "c4aa5b08-6ada-451b-9697-cc808466bda2"). InnerVolumeSpecName "kube-api-access-f5ncn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.204153 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4aa5b08-6ada-451b-9697-cc808466bda2" (UID: "c4aa5b08-6ada-451b-9697-cc808466bda2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.280688 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.280748 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4aa5b08-6ada-451b-9697-cc808466bda2-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.280772 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5ncn\" (UniqueName: \"kubernetes.io/projected/c4aa5b08-6ada-451b-9697-cc808466bda2-kube-api-access-f5ncn\") on node \"crc\" DevicePath \"\""
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.366045 5127 generic.go:334] "Generic (PLEG): container finished" podID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerID="fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780" exitCode=0
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.366111 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerDied","Data":"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"}
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.366151 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nthtc" event={"ID":"c4aa5b08-6ada-451b-9697-cc808466bda2","Type":"ContainerDied","Data":"f79eba6606d5244cc4433ae8b8fd0366e1d3a8633cbc4a309524885ee6ed3b8a"}
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.366182 5127 scope.go:117] "RemoveContainer" containerID="fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.366390 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nthtc"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.401517 5127 scope.go:117] "RemoveContainer" containerID="d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.420640 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"]
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.429921 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nthtc"]
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.450013 5127 scope.go:117] "RemoveContainer" containerID="2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.487067 5127 scope.go:117] "RemoveContainer" containerID="fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"
Feb 01 09:41:01 crc kubenswrapper[5127]: E0201 09:41:01.489111 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780\": container with ID starting with fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780 not found: ID does not exist" containerID="fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.489149 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780"} err="failed to get container status \"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780\": rpc error: code = NotFound desc = could not find container \"fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780\": container with ID starting with fdfd8c7434e3953bbfb39df2f89839dcca9b2251e0d56b88ee8afb95c32a6780 not found: ID does not exist"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.489173 5127 scope.go:117] "RemoveContainer" containerID="d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"
Feb 01 09:41:01 crc kubenswrapper[5127]: E0201 09:41:01.491612 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487\": container with ID starting with d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487 not found: ID does not exist" containerID="d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.491651 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487"} err="failed to get container status \"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487\": rpc error: code = NotFound desc = could not find container \"d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487\": container with ID starting with d5384b31c049139a1a48a41d0101be2369cdf98e0b668203367b796cfaf22487 not found: ID does not exist"
Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.491670 5127 scope.go:117] "RemoveContainer" containerID="2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4"
Feb 01 09:41:01 crc kubenswrapper[5127]: E0201 09:41:01.492010 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4\": container with ID starting with 2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4 not found: ID does not exist" containerID="2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4"
failed" err="rpc error: code = NotFound desc = could not find container \"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4\": container with ID starting with 2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4 not found: ID does not exist" containerID="2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4" Feb 01 09:41:01 crc kubenswrapper[5127]: I0201 09:41:01.492043 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4"} err="failed to get container status \"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4\": rpc error: code = NotFound desc = could not find container \"2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4\": container with ID starting with 2021d43bd896a574b625ef5660f9f5f534a3e2e298d98f2fc443258f3e4cbad4 not found: ID does not exist" Feb 01 09:41:02 crc kubenswrapper[5127]: I0201 09:41:02.254750 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" path="/var/lib/kubelet/pods/c4aa5b08-6ada-451b-9697-cc808466bda2/volumes" Feb 01 09:41:11 crc kubenswrapper[5127]: I0201 09:41:11.235791 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:41:11 crc kubenswrapper[5127]: E0201 09:41:11.236569 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:41:26 crc kubenswrapper[5127]: I0201 09:41:26.236514 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:41:26 crc kubenswrapper[5127]: E0201 09:41:26.237654 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:41:38 crc kubenswrapper[5127]: I0201 09:41:38.235443 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:41:38 crc kubenswrapper[5127]: E0201 09:41:38.237228 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:41:53 crc kubenswrapper[5127]: I0201 09:41:53.236733 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:41:53 crc kubenswrapper[5127]: E0201 09:41:53.237575 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:42:04 crc kubenswrapper[5127]: I0201 09:42:04.235991 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:42:04 crc kubenswrapper[5127]: E0201 09:42:04.236593 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:42:15 crc kubenswrapper[5127]: I0201 09:42:15.236004 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:42:15 crc kubenswrapper[5127]: E0201 09:42:15.237414 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:42:30 crc kubenswrapper[5127]: I0201 09:42:30.256491 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:42:30 crc kubenswrapper[5127]: E0201 09:42:30.257610 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:42:42 crc kubenswrapper[5127]: I0201 09:42:42.236097 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:42:42 crc kubenswrapper[5127]: E0201 09:42:42.237210 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:42:56 crc kubenswrapper[5127]: I0201 09:42:56.235402 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:42:56 crc kubenswrapper[5127]: E0201 09:42:56.236218 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:43:07 crc kubenswrapper[5127]: I0201 09:43:07.236316 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:43:07 crc kubenswrapper[5127]: E0201 09:43:07.237195 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:43:22 crc kubenswrapper[5127]: I0201 09:43:22.236048 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:43:22 crc kubenswrapper[5127]: E0201 09:43:22.236821 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:43:34 crc kubenswrapper[5127]: I0201 09:43:34.235409 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:43:34 crc kubenswrapper[5127]: E0201 09:43:34.236710 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:43:49 crc kubenswrapper[5127]: I0201 09:43:49.235964 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:43:49 crc kubenswrapper[5127]: E0201 09:43:49.237255 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:44:03 crc kubenswrapper[5127]: I0201 09:44:03.235729 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:44:03 crc kubenswrapper[5127]: E0201 09:44:03.236387 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" 
podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:44:18 crc kubenswrapper[5127]: I0201 09:44:18.236769 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:44:18 crc kubenswrapper[5127]: E0201 09:44:18.238330 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:44:29 crc kubenswrapper[5127]: I0201 09:44:29.235366 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:44:29 crc kubenswrapper[5127]: E0201 09:44:29.236083 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:44:42 crc kubenswrapper[5127]: I0201 09:44:42.236525 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:44:42 crc kubenswrapper[5127]: I0201 09:44:42.535744 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f"} Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.186884 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw"] Feb 01 09:45:00 crc kubenswrapper[5127]: E0201 09:45:00.189107 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="extract-content" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.189231 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="extract-content" Feb 01 09:45:00 crc kubenswrapper[5127]: E0201 09:45:00.189338 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="extract-utilities" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.189415 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="extract-utilities" Feb 01 09:45:00 crc kubenswrapper[5127]: E0201 09:45:00.189518 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="registry-server" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.189639 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="registry-server" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.189976 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4aa5b08-6ada-451b-9697-cc808466bda2" containerName="registry-server" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 
09:45:00.191226 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.193380 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.193763 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.204906 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw"] Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.232390 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.232455 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.232644 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhhfk\" (UniqueName: \"kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.333748 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhhfk\" (UniqueName: \"kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.334363 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.334421 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.335463 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.343419 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.349019 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhhfk\" (UniqueName: \"kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk\") pod \"collect-profiles-29498985-dxxrw\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:00 crc kubenswrapper[5127]: I0201 09:45:00.517740 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:01 crc kubenswrapper[5127]: W0201 09:45:01.019393 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c9a7ca_130f_4b16_8cba_4ae5144711d6.slice/crio-c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a WatchSource:0}: Error finding container c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a: Status 404 returned error can't find the container with id c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a Feb 01 09:45:01 crc kubenswrapper[5127]: I0201 09:45:01.021484 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw"] Feb 01 09:45:01 crc kubenswrapper[5127]: I0201 09:45:01.788901 5127 generic.go:334] "Generic (PLEG): container finished" podID="32c9a7ca-130f-4b16-8cba-4ae5144711d6" containerID="1546e4c5a0b94ec520f054b3c6b12798d746cdb0c6406519ec919b5d34374545" exitCode=0 Feb 01 09:45:01 crc kubenswrapper[5127]: I0201 09:45:01.789041 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" event={"ID":"32c9a7ca-130f-4b16-8cba-4ae5144711d6","Type":"ContainerDied","Data":"1546e4c5a0b94ec520f054b3c6b12798d746cdb0c6406519ec919b5d34374545"} Feb 01 09:45:01 crc kubenswrapper[5127]: I0201 09:45:01.789383 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" event={"ID":"32c9a7ca-130f-4b16-8cba-4ae5144711d6","Type":"ContainerStarted","Data":"c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a"} Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.265126 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.330355 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhhfk\" (UniqueName: \"kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk\") pod \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.330499 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume\") pod \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.330531 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume\") pod \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\" (UID: \"32c9a7ca-130f-4b16-8cba-4ae5144711d6\") " Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.331812 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "32c9a7ca-130f-4b16-8cba-4ae5144711d6" (UID: "32c9a7ca-130f-4b16-8cba-4ae5144711d6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.337177 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "32c9a7ca-130f-4b16-8cba-4ae5144711d6" (UID: "32c9a7ca-130f-4b16-8cba-4ae5144711d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.337854 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk" (OuterVolumeSpecName: "kube-api-access-hhhfk") pod "32c9a7ca-130f-4b16-8cba-4ae5144711d6" (UID: "32c9a7ca-130f-4b16-8cba-4ae5144711d6"). InnerVolumeSpecName "kube-api-access-hhhfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.433854 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/32c9a7ca-130f-4b16-8cba-4ae5144711d6-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.433897 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32c9a7ca-130f-4b16-8cba-4ae5144711d6-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.433917 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhhfk\" (UniqueName: \"kubernetes.io/projected/32c9a7ca-130f-4b16-8cba-4ae5144711d6-kube-api-access-hhhfk\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.819860 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" event={"ID":"32c9a7ca-130f-4b16-8cba-4ae5144711d6","Type":"ContainerDied","Data":"c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a"} Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.819922 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c74abde2e72d0681bd8e4c349fc5ffa20c48cd45f074f01ce09605f950b35f6a" Feb 01 09:45:03 crc kubenswrapper[5127]: I0201 09:45:03.819968 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29498985-dxxrw" Feb 01 09:45:04 crc kubenswrapper[5127]: I0201 09:45:04.364141 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl"] Feb 01 09:45:04 crc kubenswrapper[5127]: I0201 09:45:04.377254 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498940-hlxfl"] Feb 01 09:45:06 crc kubenswrapper[5127]: I0201 09:45:06.250250 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049" path="/var/lib/kubelet/pods/445bd8b4-0a63-4b4d-a81d-d7c9cb4ba049/volumes" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.069227 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:23 crc kubenswrapper[5127]: E0201 09:45:23.070670 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c9a7ca-130f-4b16-8cba-4ae5144711d6" containerName="collect-profiles" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.070692 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c9a7ca-130f-4b16-8cba-4ae5144711d6" containerName="collect-profiles" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.070997 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c9a7ca-130f-4b16-8cba-4ae5144711d6" containerName="collect-profiles" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.073069 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.087009 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.127650 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.127743 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.127775 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f25z\" (UniqueName: \"kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.229716 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.229779 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f25z\" (UniqueName: \"kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.230020 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.230756 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.230900 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.250601 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8f25z\" (UniqueName: \"kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z\") pod \"redhat-operators-lzlgh\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.446066 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:23 crc kubenswrapper[5127]: I0201 09:45:23.925367 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:24 crc kubenswrapper[5127]: I0201 09:45:24.152962 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerStarted","Data":"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946"} Feb 01 09:45:24 crc kubenswrapper[5127]: I0201 09:45:24.153908 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerStarted","Data":"e1c2760cc2dbbd4314b92dc4931f01b9895346a09b25a93f0eb7eb35f9d388cc"} Feb 01 09:45:25 crc kubenswrapper[5127]: I0201 09:45:25.171981 5127 generic.go:334] "Generic (PLEG): container finished" podID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerID="8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946" exitCode=0 Feb 01 09:45:25 crc kubenswrapper[5127]: I0201 09:45:25.172063 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerDied","Data":"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946"} Feb 01 09:45:26 crc kubenswrapper[5127]: I0201 09:45:26.188680 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerStarted","Data":"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73"} Feb 01 09:45:29 crc kubenswrapper[5127]: I0201 09:45:29.237460 5127 generic.go:334] "Generic (PLEG): container finished" podID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerID="620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73" exitCode=0 Feb 01 09:45:29 crc kubenswrapper[5127]: I0201 09:45:29.237643 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerDied","Data":"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73"} Feb 01 09:45:30 crc kubenswrapper[5127]: I0201 09:45:30.278942 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerStarted","Data":"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d"} Feb 01 09:45:30 crc kubenswrapper[5127]: I0201 09:45:30.317606 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lzlgh" podStartSLOduration=2.794672417 podStartE2EDuration="7.31756468s" podCreationTimestamp="2026-02-01 09:45:23 +0000 UTC" firstStartedPulling="2026-02-01 09:45:25.178097946 +0000 UTC m=+10675.664000349" lastFinishedPulling="2026-02-01 09:45:29.700990229 +0000 UTC m=+10680.186892612" observedRunningTime="2026-02-01 
09:45:30.30530544 +0000 UTC m=+10680.791207823" watchObservedRunningTime="2026-02-01 09:45:30.31756468 +0000 UTC m=+10680.803467043" Feb 01 09:45:33 crc kubenswrapper[5127]: I0201 09:45:33.446964 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:33 crc kubenswrapper[5127]: I0201 09:45:33.447495 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:34 crc kubenswrapper[5127]: I0201 09:45:34.512694 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lzlgh" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="registry-server" probeResult="failure" output=< Feb 01 09:45:34 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:45:34 crc kubenswrapper[5127]: > Feb 01 09:45:43 crc kubenswrapper[5127]: I0201 09:45:43.531448 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:43 crc kubenswrapper[5127]: I0201 09:45:43.628116 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:43 crc kubenswrapper[5127]: I0201 09:45:43.785850 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:44 crc kubenswrapper[5127]: I0201 09:45:44.865093 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lzlgh" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="registry-server" containerID="cri-o://aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d" gracePeriod=2 Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.372248 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.418843 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content\") pod \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.419361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities\") pod \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.419447 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f25z\" (UniqueName: \"kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z\") pod \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\" (UID: \"94cfc74b-fd37-4676-a6a8-eda9407b0aca\") " Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.420351 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities" (OuterVolumeSpecName: "utilities") pod "94cfc74b-fd37-4676-a6a8-eda9407b0aca" (UID: "94cfc74b-fd37-4676-a6a8-eda9407b0aca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.428427 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z" (OuterVolumeSpecName: "kube-api-access-8f25z") pod "94cfc74b-fd37-4676-a6a8-eda9407b0aca" (UID: "94cfc74b-fd37-4676-a6a8-eda9407b0aca"). InnerVolumeSpecName "kube-api-access-8f25z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.522338 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f25z\" (UniqueName: \"kubernetes.io/projected/94cfc74b-fd37-4676-a6a8-eda9407b0aca-kube-api-access-8f25z\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.522387 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.569108 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94cfc74b-fd37-4676-a6a8-eda9407b0aca" (UID: "94cfc74b-fd37-4676-a6a8-eda9407b0aca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.623661 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94cfc74b-fd37-4676-a6a8-eda9407b0aca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.876355 5127 generic.go:334] "Generic (PLEG): container finished" podID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerID="aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d" exitCode=0 Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.876396 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerDied","Data":"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d"} Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.876422 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzlgh" event={"ID":"94cfc74b-fd37-4676-a6a8-eda9407b0aca","Type":"ContainerDied","Data":"e1c2760cc2dbbd4314b92dc4931f01b9895346a09b25a93f0eb7eb35f9d388cc"} Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.876440 5127 scope.go:117] "RemoveContainer" containerID="aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.876540 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzlgh" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.910172 5127 scope.go:117] "RemoveContainer" containerID="620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.913769 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.924425 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lzlgh"] Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.933951 5127 scope.go:117] "RemoveContainer" containerID="8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.985360 5127 scope.go:117] "RemoveContainer" containerID="aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d" Feb 01 09:45:45 crc kubenswrapper[5127]: E0201 09:45:45.986140 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d\": container with ID starting with aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d not found: ID does not exist" containerID="aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.986200 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d"} err="failed to get container status \"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d\": rpc error: code = NotFound desc = could not find container \"aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d\": container with ID starting with aa1e83e2efea3aa4ec350cdd5e6a7ecf071040ac7e07fad140246cf542725c4d not found: ID does not exist" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.986229 5127 scope.go:117] "RemoveContainer" containerID="620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73" Feb 01 09:45:45 crc kubenswrapper[5127]: E0201 09:45:45.986855 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73\": container with ID starting with 620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73 not found: ID does not exist" containerID="620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.986887 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73"} err="failed to get container status \"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73\": rpc error: code = NotFound desc = could not find container \"620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73\": container with ID starting with 620868b213a239d07867cb23704bd320b265b4096a773332fd9bd2f11dc59b73 not found: ID does not exist" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.986910 5127 scope.go:117] "RemoveContainer" containerID="8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946" Feb 01 09:45:45 crc kubenswrapper[5127]: E0201 09:45:45.987298 5127 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946\": container with ID starting with 8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946 not found: ID does not exist" containerID="8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946" Feb 01 09:45:45 crc kubenswrapper[5127]: I0201 09:45:45.987324 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946"} err="failed to get container status \"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946\": rpc error: code = NotFound desc = could not find container \"8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946\": container with ID starting with 8fd000472d1c6242231e4071d3d5b05b49fc783402735f0f8be2f20a0438a946 not found: ID does not exist" Feb 01 09:45:46 crc kubenswrapper[5127]: I0201 09:45:46.258332 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" path="/var/lib/kubelet/pods/94cfc74b-fd37-4676-a6a8-eda9407b0aca/volumes" Feb 01 09:45:50 crc kubenswrapper[5127]: I0201 09:45:50.612133 5127 scope.go:117] "RemoveContainer" containerID="ddf0f41da6911090227e408984f8ecc3f69552cb26b8ab424b9f9b8f4115fd23" Feb 01 09:45:51 crc kubenswrapper[5127]: I0201 09:45:51.011459 5127 scope.go:117] "RemoveContainer" containerID="4d2920b83f4638e5651205db0742bdd410327abf6ccd945283e184814851e9b2" Feb 01 09:45:51 crc kubenswrapper[5127]: I0201 09:45:51.064109 5127 scope.go:117] "RemoveContainer" containerID="ed61627389df19a0aa6344f110d8bed9bb08f314a03971455dfc9f34d4603671" Feb 01 09:45:51 crc kubenswrapper[5127]: I0201 09:45:51.092807 5127 scope.go:117] "RemoveContainer" containerID="dbf8b0f82807a556d598bf436c2ef0c0a14d78cc278a0509b4c0b8e1ed3f9579" Feb 01 09:47:06 crc kubenswrapper[5127]: I0201 09:47:06.741021 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:47:06 crc kubenswrapper[5127]: I0201 09:47:06.741772 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:47:36 crc kubenswrapper[5127]: I0201 09:47:36.741222 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:47:36 crc kubenswrapper[5127]: I0201 09:47:36.741886 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:48:06 crc kubenswrapper[5127]: I0201 09:48:06.740944 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:48:06 crc kubenswrapper[5127]: I0201 09:48:06.741960 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:48:06 crc kubenswrapper[5127]: I0201 09:48:06.742045 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:48:06 crc kubenswrapper[5127]: I0201 09:48:06.743264 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:48:06 crc kubenswrapper[5127]: I0201 09:48:06.743366 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f" gracePeriod=600 Feb 01 09:48:07 crc kubenswrapper[5127]: I0201 09:48:07.835844 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f" exitCode=0 Feb 01 09:48:07 crc kubenswrapper[5127]: I0201 09:48:07.835904 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f"} Feb 01 09:48:07 crc kubenswrapper[5127]: I0201 09:48:07.836392 5127 scope.go:117] "RemoveContainer" containerID="6758a63de29523c0bf26cfd650a9753456091ea05fa0e77cec4375e0401531ff" Feb 01 09:48:08 crc kubenswrapper[5127]: I0201 09:48:08.857057 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6"} Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.831120 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"] Feb 01 09:49:22 crc kubenswrapper[5127]: E0201 09:49:22.832193 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="extract-utilities" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.832211 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="extract-utilities" Feb 01 09:49:22 crc kubenswrapper[5127]: E0201 09:49:22.832238 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="registry-server" Feb 01 09:49:22 crc 
kubenswrapper[5127]: I0201 09:49:22.832246 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="registry-server" Feb 01 09:49:22 crc kubenswrapper[5127]: E0201 09:49:22.832280 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="extract-content" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.832288 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="extract-content" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.835498 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="94cfc74b-fd37-4676-a6a8-eda9407b0aca" containerName="registry-server" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.874208 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.910917 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"] Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.992035 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.993253 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgjnm\" (UniqueName: \"kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:22 crc kubenswrapper[5127]: I0201 09:49:22.993353 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-utilities\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.094634 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-utilities\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.095044 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.095127 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-utilities\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4" Feb 01 
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.095241 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgjnm\" (UniqueName: \"kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.095442 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.132555 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgjnm\" (UniqueName: \"kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm\") pod \"certified-operators-8cmp4\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") " pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.214104 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.783952 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"]
Feb 01 09:49:23 crc kubenswrapper[5127]: I0201 09:49:23.815740 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerStarted","Data":"a6d4b8c7f35f29e047ab27316910f033c650953550f62c6a3626bdc031e2c81c"}
Feb 01 09:49:24 crc kubenswrapper[5127]: I0201 09:49:24.828135 5127 generic.go:334] "Generic (PLEG): container finished" podID="49851744-5210-4488-a154-bd57092fab74" containerID="727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf" exitCode=0
Feb 01 09:49:24 crc kubenswrapper[5127]: I0201 09:49:24.828245 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerDied","Data":"727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf"}
Feb 01 09:49:24 crc kubenswrapper[5127]: I0201 09:49:24.831212 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 01 09:49:27 crc kubenswrapper[5127]: I0201 09:49:27.869759 5127 generic.go:334] "Generic (PLEG): container finished" podID="49851744-5210-4488-a154-bd57092fab74" containerID="a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b" exitCode=0
Feb 01 09:49:27 crc kubenswrapper[5127]: I0201 09:49:27.870173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerDied","Data":"a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b"}
Feb 01 09:49:28 crc kubenswrapper[5127]: I0201 09:49:28.890435 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerStarted","Data":"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6"}
Feb 01 09:49:28 crc kubenswrapper[5127]: I0201 09:49:28.938752 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8cmp4" podStartSLOduration=3.4061485080000002 podStartE2EDuration="6.938729738s" podCreationTimestamp="2026-02-01 09:49:22 +0000 UTC" firstStartedPulling="2026-02-01 09:49:24.830905008 +0000 UTC m=+10915.316807381" lastFinishedPulling="2026-02-01 09:49:28.363486218 +0000 UTC m=+10918.849388611" observedRunningTime="2026-02-01 09:49:28.921214537 +0000 UTC m=+10919.407116910" watchObservedRunningTime="2026-02-01 09:49:28.938729738 +0000 UTC m=+10919.424632111"
Feb 01 09:49:33 crc kubenswrapper[5127]: I0201 09:49:33.215053 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:33 crc kubenswrapper[5127]: I0201 09:49:33.216188 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:33 crc kubenswrapper[5127]: I0201 09:49:33.297818 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:34 crc kubenswrapper[5127]: I0201 09:49:34.020224 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:34 crc kubenswrapper[5127]: I0201 09:49:34.103394 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"]
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.005311 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cmp4" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="registry-server" containerID="cri-o://c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6" gracePeriod=2
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.526061 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.543948 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-utilities\") pod \"49851744-5210-4488-a154-bd57092fab74\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") "
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.544063 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgjnm\" (UniqueName: \"kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm\") pod \"49851744-5210-4488-a154-bd57092fab74\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") "
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.544248 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content\") pod \"49851744-5210-4488-a154-bd57092fab74\" (UID: \"49851744-5210-4488-a154-bd57092fab74\") "
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.550568 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm" (OuterVolumeSpecName: "kube-api-access-cgjnm") pod "49851744-5210-4488-a154-bd57092fab74" (UID: "49851744-5210-4488-a154-bd57092fab74"). InnerVolumeSpecName "kube-api-access-cgjnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.614533 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49851744-5210-4488-a154-bd57092fab74" (UID: "49851744-5210-4488-a154-bd57092fab74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.646525 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.646568 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgjnm\" (UniqueName: \"kubernetes.io/projected/49851744-5210-4488-a154-bd57092fab74-kube-api-access-cgjnm\") on node \"crc\" DevicePath \"\"" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.646619 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49851744-5210-4488-a154-bd57092fab74-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.719951 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 01 09:49:36 crc kubenswrapper[5127]: E0201 09:49:36.720604 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="registry-server" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.720632 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="registry-server" Feb 01 09:49:36 crc kubenswrapper[5127]: E0201 09:49:36.721111 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="extract-utilities" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.721124 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="extract-utilities" Feb 01 09:49:36 crc kubenswrapper[5127]: E0201 09:49:36.721157 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="extract-content" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.721166 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="extract-content" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.721400 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="49851744-5210-4488-a154-bd57092fab74" containerName="registry-server" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.722416 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.728241 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.728686 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.728760 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.728787 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hxbhz" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.735635 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.749732 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.749923 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750001 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750213 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750305 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750361 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750482 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.750833 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.751016 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxhx\" (UniqueName: \"kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854059 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854241 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854482 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854639 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854697 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxhx\" (UniqueName: \"kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854776 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854876 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir\") pod 
\"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.854950 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.855071 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.855159 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.856350 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.856846 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.857644 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.858324 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.861365 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.863061 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest" Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.874972 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest"
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.883755 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxhx\" (UniqueName: \"kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest"
Feb 01 09:49:36 crc kubenswrapper[5127]: I0201 09:49:36.908018 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " pod="openstack/tempest-tests-tempest"
Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.020989 5127 generic.go:334] "Generic (PLEG): container finished" podID="49851744-5210-4488-a154-bd57092fab74" containerID="c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6" exitCode=0
Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.021066 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerDied","Data":"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6"}
Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.021130 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cmp4" event={"ID":"49851744-5210-4488-a154-bd57092fab74","Type":"ContainerDied","Data":"a6d4b8c7f35f29e047ab27316910f033c650953550f62c6a3626bdc031e2c81c"}
Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.021126 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cmp4"
Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.021157 5127 scope.go:117] "RemoveContainer" containerID="c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6"
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.074270 5127 scope.go:117] "RemoveContainer" containerID="a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.095143 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"] Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.109286 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cmp4"] Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.122431 5127 scope.go:117] "RemoveContainer" containerID="727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.271023 5127 scope.go:117] "RemoveContainer" containerID="c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6" Feb 01 09:49:37 crc kubenswrapper[5127]: E0201 09:49:37.272049 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6\": container with ID starting with c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6 not found: ID does not exist" containerID="c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.272105 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6"} err="failed to get container status \"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6\": rpc error: code = NotFound desc = could not find container \"c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6\": container with ID starting with c4296d2de51dc7853e4ccc25df05864da76a622ce67ca2b4a5ef6a827d418eb6 not found: ID does not exist" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.272139 5127 scope.go:117] "RemoveContainer" containerID="a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b" Feb 01 09:49:37 crc kubenswrapper[5127]: E0201 09:49:37.272576 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b\": container with ID starting with a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b not found: ID does not exist" containerID="a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.272616 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b"} err="failed to get container status \"a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b\": rpc error: code = NotFound desc = could not find container \"a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b\": container with ID starting with a85dba48991340bdc8622d67e21e61bce28d92013f76c7d052a03f1dd71b3f7b not found: ID does not exist" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.272629 5127 scope.go:117] "RemoveContainer" containerID="727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf" Feb 01 09:49:37 crc kubenswrapper[5127]: E0201 09:49:37.272833 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf\": container with ID starting with 727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf not found: ID does not exist" containerID="727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.272857 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf"} err="failed to get container status \"727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf\": rpc error: code = NotFound desc = could not find container \"727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf\": container with ID starting with 727e713c453f654ff1520a87e3c8c0b9afcca6fb92502ab052b88cf52f10f6bf not found: ID does not exist" Feb 01 09:49:37 crc kubenswrapper[5127]: I0201 09:49:37.632691 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 01 09:49:37 crc kubenswrapper[5127]: W0201 09:49:37.640838 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod068a067f_bb12_4a63_a3ed_7eb05da0ca52.slice/crio-0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1 WatchSource:0}: Error finding container 0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1: Status 404 returned error can't find the container with id 0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1 Feb 01 09:49:38 crc kubenswrapper[5127]: I0201 09:49:38.032530 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"068a067f-bb12-4a63-a3ed-7eb05da0ca52","Type":"ContainerStarted","Data":"0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1"} Feb 01 09:49:38 crc kubenswrapper[5127]: I0201 09:49:38.250482 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49851744-5210-4488-a154-bd57092fab74" path="/var/lib/kubelet/pods/49851744-5210-4488-a154-bd57092fab74/volumes" Feb 01 09:50:27 crc kubenswrapper[5127]: E0201 09:50:27.533069 5127 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 09:50:27 crc kubenswrapper[5127]: E0201 09:50:27.533532 5127 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871" Feb 01 09:50:27 crc kubenswrapper[5127]: E0201 09:50:27.533710 5127 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kpxhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(068a067f-bb12-4a63-a3ed-7eb05da0ca52): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 09:50:27 crc kubenswrapper[5127]: E0201 09:50:27.534906 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" Feb 01 09:50:27 crc kubenswrapper[5127]: E0201 09:50:27.591947 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:8a0e02dd0fb8f726038072d0e3af1871\\\"\"" pod="openstack/tempest-tests-tempest" podUID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" Feb 01 09:50:36 crc kubenswrapper[5127]: I0201 09:50:36.740816 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:50:36 crc kubenswrapper[5127]: I0201 09:50:36.741486 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:50:38 crc kubenswrapper[5127]: I0201 09:50:38.617507 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 01 09:50:40 crc kubenswrapper[5127]: I0201 09:50:40.750213 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"068a067f-bb12-4a63-a3ed-7eb05da0ca52","Type":"ContainerStarted","Data":"6773599dfdea5689480cd11f1c215fa9fa1892c25678ce82b49cb17238b1482a"} Feb 01 09:50:40 crc kubenswrapper[5127]: I0201 09:50:40.785851 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.815868842 podStartE2EDuration="1m5.785831165s" podCreationTimestamp="2026-02-01 09:49:35 +0000 UTC" firstStartedPulling="2026-02-01 09:49:37.643930213 +0000 UTC m=+10928.129832576" lastFinishedPulling="2026-02-01 09:50:38.613892506 +0000 UTC m=+10989.099794899" observedRunningTime="2026-02-01 09:50:40.777433109 +0000 UTC m=+10991.263335482" watchObservedRunningTime="2026-02-01 09:50:40.785831165 +0000 UTC m=+10991.271733528" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.619098 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.622327 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.630948 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.645276 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-catalog-content\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.645434 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kjt\" (UniqueName: \"kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.645475 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.747519 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-catalog-content\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.747710 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kjt\" (UniqueName: \"kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.747760 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.748405 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.748709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-catalog-content\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.767710 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k2kjt\" (UniqueName: \"kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt\") pod \"community-operators-n9vn7\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") " pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:56 crc kubenswrapper[5127]: I0201 09:50:56.941850 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:50:58 crc kubenswrapper[5127]: I0201 09:50:58.250938 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:50:58 crc kubenswrapper[5127]: I0201 09:50:58.994066 5127 generic.go:334] "Generic (PLEG): container finished" podID="76409d94-1412-40dd-af10-1aa6e8bda465" containerID="ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c" exitCode=0 Feb 01 09:50:58 crc kubenswrapper[5127]: I0201 09:50:58.994172 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerDied","Data":"ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c"} Feb 01 09:50:58 crc kubenswrapper[5127]: I0201 09:50:58.994421 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerStarted","Data":"f0bca6ac61b51255a294ac5bb8441035000e32d735c11ddf5a57a6a45da1c4ee"} Feb 01 09:51:02 crc kubenswrapper[5127]: I0201 09:51:02.031715 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerStarted","Data":"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9"} Feb 01 09:51:06 crc kubenswrapper[5127]: I0201 09:51:06.741098 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:51:06 crc kubenswrapper[5127]: I0201 09:51:06.741845 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:51:06 crc kubenswrapper[5127]: I0201 09:51:06.891838 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:06 crc kubenswrapper[5127]: I0201 09:51:06.894699 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:06 crc kubenswrapper[5127]: I0201 09:51:06.904200 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.084662 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wjb8\" (UniqueName: \"kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.085052 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.085257 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.188047 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.188185 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wjb8\" (UniqueName: \"kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.188280 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.188534 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.188695 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.209335 5127 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5wjb8\" (UniqueName: \"kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8\") pod \"redhat-marketplace-xcnbb\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.236036 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:07 crc kubenswrapper[5127]: I0201 09:51:07.776451 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:08 crc kubenswrapper[5127]: I0201 09:51:08.124840 5127 generic.go:334] "Generic (PLEG): container finished" podID="3d5adea3-b70d-4465-b632-38556dcde84f" containerID="5517d1df4bc621e1554247130fa2ba57e948a8aea026a67380e22b0184897ea5" exitCode=0 Feb 01 09:51:08 crc kubenswrapper[5127]: I0201 09:51:08.124896 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerDied","Data":"5517d1df4bc621e1554247130fa2ba57e948a8aea026a67380e22b0184897ea5"} Feb 01 09:51:08 crc kubenswrapper[5127]: I0201 09:51:08.124920 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerStarted","Data":"c61835cea19caa2a5f7ad383b11cdeda23fbca1a9726a11215621a20487cc4e1"} Feb 01 09:51:08 crc kubenswrapper[5127]: I0201 09:51:08.134180 5127 generic.go:334] "Generic (PLEG): container finished" podID="76409d94-1412-40dd-af10-1aa6e8bda465" containerID="7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9" exitCode=0 Feb 01 09:51:08 crc kubenswrapper[5127]: I0201 09:51:08.134279 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerDied","Data":"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9"} Feb 01 09:51:10 crc kubenswrapper[5127]: I0201 09:51:10.174426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerStarted","Data":"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca"} Feb 01 09:51:10 crc kubenswrapper[5127]: I0201 09:51:10.177476 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerStarted","Data":"67589d7ae1f34b9e61ba05e288b3517dda1bd1f73338a126d6c6985662f38772"} Feb 01 09:51:10 crc kubenswrapper[5127]: I0201 09:51:10.202217 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n9vn7" podStartSLOduration=4.106433016 podStartE2EDuration="14.202201278s" podCreationTimestamp="2026-02-01 09:50:56 +0000 UTC" firstStartedPulling="2026-02-01 09:50:58.997482034 +0000 UTC m=+11009.483384397" lastFinishedPulling="2026-02-01 09:51:09.093250256 +0000 UTC m=+11019.579152659" observedRunningTime="2026-02-01 09:51:10.196008381 +0000 UTC m=+11020.681910744" watchObservedRunningTime="2026-02-01 09:51:10.202201278 +0000 UTC m=+11020.688103641" Feb 01 09:51:15 crc kubenswrapper[5127]: I0201 09:51:15.238497 5127 generic.go:334] "Generic (PLEG): container finished" 
podID="3d5adea3-b70d-4465-b632-38556dcde84f" containerID="67589d7ae1f34b9e61ba05e288b3517dda1bd1f73338a126d6c6985662f38772" exitCode=0 Feb 01 09:51:15 crc kubenswrapper[5127]: I0201 09:51:15.238556 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerDied","Data":"67589d7ae1f34b9e61ba05e288b3517dda1bd1f73338a126d6c6985662f38772"} Feb 01 09:51:16 crc kubenswrapper[5127]: I0201 09:51:16.258004 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerStarted","Data":"d859562cf2b8866edd07d16d87fbb0a48cfbb4672ab45ce99508ee0493774015"} Feb 01 09:51:16 crc kubenswrapper[5127]: I0201 09:51:16.293919 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xcnbb" podStartSLOduration=2.644412156 podStartE2EDuration="10.29389582s" podCreationTimestamp="2026-02-01 09:51:06 +0000 UTC" firstStartedPulling="2026-02-01 09:51:08.126938889 +0000 UTC m=+11018.612841252" lastFinishedPulling="2026-02-01 09:51:15.776422553 +0000 UTC m=+11026.262324916" observedRunningTime="2026-02-01 09:51:16.278761603 +0000 UTC m=+11026.764663976" watchObservedRunningTime="2026-02-01 09:51:16.29389582 +0000 UTC m=+11026.779798193" Feb 01 09:51:16 crc kubenswrapper[5127]: I0201 09:51:16.942840 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:51:16 crc kubenswrapper[5127]: I0201 09:51:16.944781 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:51:17 crc kubenswrapper[5127]: I0201 09:51:17.009953 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:51:17 crc kubenswrapper[5127]: I0201 09:51:17.236349 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:17 crc kubenswrapper[5127]: I0201 09:51:17.236408 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:17 crc kubenswrapper[5127]: I0201 09:51:17.316682 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:51:17 crc kubenswrapper[5127]: I0201 09:51:17.501178 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:51:18 crc kubenswrapper[5127]: I0201 09:51:18.306854 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xcnbb" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" probeResult="failure" output=< Feb 01 09:51:18 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:51:18 crc kubenswrapper[5127]: > Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.286031 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n9vn7" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="registry-server" containerID="cri-o://9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca" gracePeriod=2 Feb 01 09:51:19 crc 
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.764014 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9vn7"
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.787736 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities\") pod \"76409d94-1412-40dd-af10-1aa6e8bda465\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") "
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.788227 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-catalog-content\") pod \"76409d94-1412-40dd-af10-1aa6e8bda465\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") "
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.788497 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities" (OuterVolumeSpecName: "utilities") pod "76409d94-1412-40dd-af10-1aa6e8bda465" (UID: "76409d94-1412-40dd-af10-1aa6e8bda465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.788800 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2kjt\" (UniqueName: \"kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt\") pod \"76409d94-1412-40dd-af10-1aa6e8bda465\" (UID: \"76409d94-1412-40dd-af10-1aa6e8bda465\") "
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.793483 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.798426 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt" (OuterVolumeSpecName: "kube-api-access-k2kjt") pod "76409d94-1412-40dd-af10-1aa6e8bda465" (UID: "76409d94-1412-40dd-af10-1aa6e8bda465"). InnerVolumeSpecName "kube-api-access-k2kjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.896192 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76409d94-1412-40dd-af10-1aa6e8bda465-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:51:19 crc kubenswrapper[5127]: I0201 09:51:19.896233 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2kjt\" (UniqueName: \"kubernetes.io/projected/76409d94-1412-40dd-af10-1aa6e8bda465-kube-api-access-k2kjt\") on node \"crc\" DevicePath \"\"" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.300994 5127 generic.go:334] "Generic (PLEG): container finished" podID="76409d94-1412-40dd-af10-1aa6e8bda465" containerID="9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca" exitCode=0 Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.302173 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerDied","Data":"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca"} Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.302292 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9vn7" event={"ID":"76409d94-1412-40dd-af10-1aa6e8bda465","Type":"ContainerDied","Data":"f0bca6ac61b51255a294ac5bb8441035000e32d735c11ddf5a57a6a45da1c4ee"} Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.302427 5127 scope.go:117] "RemoveContainer" containerID="9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.302717 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n9vn7" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.339901 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.349364 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n9vn7"] Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.351258 5127 scope.go:117] "RemoveContainer" containerID="7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.700866 5127 scope.go:117] "RemoveContainer" containerID="ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.748617 5127 scope.go:117] "RemoveContainer" containerID="9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca" Feb 01 09:51:20 crc kubenswrapper[5127]: E0201 09:51:20.749212 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca\": container with ID starting with 9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca not found: ID does not exist" containerID="9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.749264 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca"} err="failed to get container status \"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca\": rpc error: code = NotFound desc = could not find container \"9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca\": container with ID starting with 9c49d1cb1fd2503e96e13713a9fab544e8a25c873f7eb4442c696cfb676856ca not found: ID does not exist" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.749297 5127 scope.go:117] "RemoveContainer" containerID="7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9" Feb 01 09:51:20 crc kubenswrapper[5127]: E0201 09:51:20.750022 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9\": container with ID starting with 7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9 not found: ID does not exist" containerID="7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.750148 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9"} err="failed to get container status \"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9\": rpc error: code = NotFound desc = could not find container \"7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9\": container with ID starting with 7d2e6fb26889097c417d39c2eb4cb5d1e061ee211beea3b13bb63d614e07a4c9 not found: ID does not exist" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.750256 5127 scope.go:117] "RemoveContainer" containerID="ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c" Feb 01 09:51:20 crc kubenswrapper[5127]: E0201 09:51:20.750684 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c\": container with ID starting with ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c not found: ID does not exist" containerID="ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c" Feb 01 09:51:20 crc kubenswrapper[5127]: I0201 09:51:20.750720 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c"} err="failed to get container status \"ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c\": rpc error: code = NotFound desc = could not find container \"ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c\": container with ID starting with ea098b533c6f9c87d7428b7b62d9301c7691a3137c9bdf845257798486fa242c not found: ID does not exist" Feb 01 09:51:22 crc kubenswrapper[5127]: I0201 09:51:22.272959 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" path="/var/lib/kubelet/pods/76409d94-1412-40dd-af10-1aa6e8bda465/volumes" Feb 01 09:51:28 crc kubenswrapper[5127]: I0201 09:51:28.315420 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-xcnbb" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" probeResult="failure" output=< Feb 01 09:51:28 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 09:51:28 crc kubenswrapper[5127]: > Feb 01 09:51:36 crc kubenswrapper[5127]: I0201 09:51:36.741000 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:51:36 crc kubenswrapper[5127]: I0201 09:51:36.741851 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:51:36 crc kubenswrapper[5127]: I0201 09:51:36.741939 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 09:51:36 crc kubenswrapper[5127]: I0201 09:51:36.743120 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 09:51:36 crc kubenswrapper[5127]: I0201 09:51:36.743206 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" gracePeriod=600 Feb 01 09:51:36 crc kubenswrapper[5127]: E0201 09:51:36.889456 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.471495 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" exitCode=0 Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.471576 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6"} Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.471869 5127 scope.go:117] "RemoveContainer" containerID="3057b89d670a74eee49f051c22e764de4abf61d8b64af0f4872542456609068f" Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.472593 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:51:37 crc kubenswrapper[5127]: E0201 09:51:37.472887 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.724920 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:37 crc kubenswrapper[5127]: I0201 09:51:37.786561 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:38 crc kubenswrapper[5127]: I0201 09:51:38.093638 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:39 crc kubenswrapper[5127]: I0201 09:51:39.492755 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xcnbb" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" containerID="cri-o://d859562cf2b8866edd07d16d87fbb0a48cfbb4672ab45ce99508ee0493774015" gracePeriod=2 Feb 01 09:51:40 crc kubenswrapper[5127]: I0201 09:51:40.507822 5127 generic.go:334] "Generic (PLEG): container finished" podID="3d5adea3-b70d-4465-b632-38556dcde84f" containerID="d859562cf2b8866edd07d16d87fbb0a48cfbb4672ab45ce99508ee0493774015" exitCode=0 Feb 01 09:51:40 crc kubenswrapper[5127]: I0201 09:51:40.507900 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerDied","Data":"d859562cf2b8866edd07d16d87fbb0a48cfbb4672ab45ce99508ee0493774015"} Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.139278 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.297166 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wjb8\" (UniqueName: \"kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8\") pod \"3d5adea3-b70d-4465-b632-38556dcde84f\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.297306 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities\") pod \"3d5adea3-b70d-4465-b632-38556dcde84f\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.297442 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content\") pod \"3d5adea3-b70d-4465-b632-38556dcde84f\" (UID: \"3d5adea3-b70d-4465-b632-38556dcde84f\") " Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.298075 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities" (OuterVolumeSpecName: "utilities") pod "3d5adea3-b70d-4465-b632-38556dcde84f" (UID: "3d5adea3-b70d-4465-b632-38556dcde84f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.304716 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8" (OuterVolumeSpecName: "kube-api-access-5wjb8") pod "3d5adea3-b70d-4465-b632-38556dcde84f" (UID: "3d5adea3-b70d-4465-b632-38556dcde84f"). InnerVolumeSpecName "kube-api-access-5wjb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.321302 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d5adea3-b70d-4465-b632-38556dcde84f" (UID: "3d5adea3-b70d-4465-b632-38556dcde84f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.401364 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.401423 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wjb8\" (UniqueName: \"kubernetes.io/projected/3d5adea3-b70d-4465-b632-38556dcde84f-kube-api-access-5wjb8\") on node \"crc\" DevicePath \"\"" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.401437 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5adea3-b70d-4465-b632-38556dcde84f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.585855 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xcnbb" event={"ID":"3d5adea3-b70d-4465-b632-38556dcde84f","Type":"ContainerDied","Data":"c61835cea19caa2a5f7ad383b11cdeda23fbca1a9726a11215621a20487cc4e1"} Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.585922 5127 scope.go:117] "RemoveContainer" containerID="d859562cf2b8866edd07d16d87fbb0a48cfbb4672ab45ce99508ee0493774015" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.585973 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xcnbb" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.645443 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.646618 5127 scope.go:117] "RemoveContainer" containerID="67589d7ae1f34b9e61ba05e288b3517dda1bd1f73338a126d6c6985662f38772" Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.669303 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xcnbb"] Feb 01 09:51:41 crc kubenswrapper[5127]: I0201 09:51:41.687855 5127 scope.go:117] "RemoveContainer" containerID="5517d1df4bc621e1554247130fa2ba57e948a8aea026a67380e22b0184897ea5" Feb 01 09:51:41 crc kubenswrapper[5127]: E0201 09:51:41.748417 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d5adea3_b70d_4465_b632_38556dcde84f.slice\": RecentStats: unable to find data in memory cache]" Feb 01 09:51:42 crc kubenswrapper[5127]: I0201 09:51:42.247223 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" path="/var/lib/kubelet/pods/3d5adea3-b70d-4465-b632-38556dcde84f/volumes" Feb 01 09:51:48 crc kubenswrapper[5127]: I0201 09:51:48.235737 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:51:48 crc kubenswrapper[5127]: E0201 09:51:48.236835 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:52:01 crc 
kubenswrapper[5127]: I0201 09:52:01.236674 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:52:01 crc kubenswrapper[5127]: E0201 09:52:01.240474 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:52:14 crc kubenswrapper[5127]: I0201 09:52:14.235433 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:52:14 crc kubenswrapper[5127]: E0201 09:52:14.236186 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:52:29 crc kubenswrapper[5127]: I0201 09:52:29.237251 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:52:29 crc kubenswrapper[5127]: E0201 09:52:29.238064 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:52:44 crc kubenswrapper[5127]: I0201 09:52:44.236234 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:52:44 crc kubenswrapper[5127]: E0201 09:52:44.236969 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:52:59 crc kubenswrapper[5127]: I0201 09:52:59.235063 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:52:59 crc kubenswrapper[5127]: E0201 09:52:59.235757 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:53:11 crc kubenswrapper[5127]: I0201 09:53:11.238235 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:53:11 crc 
kubenswrapper[5127]: E0201 09:53:11.239442 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:53:25 crc kubenswrapper[5127]: I0201 09:53:25.236331 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:53:25 crc kubenswrapper[5127]: E0201 09:53:25.237928 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:53:38 crc kubenswrapper[5127]: I0201 09:53:38.243802 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:53:38 crc kubenswrapper[5127]: E0201 09:53:38.244683 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:53:53 crc kubenswrapper[5127]: I0201 09:53:53.236026 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:53:53 crc kubenswrapper[5127]: E0201 09:53:53.236994 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:54:08 crc kubenswrapper[5127]: I0201 09:54:08.235385 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:54:08 crc kubenswrapper[5127]: E0201 09:54:08.236066 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:54:21 crc kubenswrapper[5127]: I0201 09:54:21.235270 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:54:21 crc kubenswrapper[5127]: E0201 09:54:21.236062 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:54:33 crc kubenswrapper[5127]: I0201 09:54:33.236231 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:54:33 crc kubenswrapper[5127]: E0201 09:54:33.237215 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:54:46 crc kubenswrapper[5127]: I0201 09:54:46.236133 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:54:46 crc kubenswrapper[5127]: E0201 09:54:46.237141 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:54:59 crc kubenswrapper[5127]: I0201 09:54:59.235508 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:54:59 crc kubenswrapper[5127]: E0201 09:54:59.236372 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:55:14 crc kubenswrapper[5127]: I0201 09:55:14.236225 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:55:14 crc kubenswrapper[5127]: E0201 09:55:14.237117 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:55:28 crc kubenswrapper[5127]: I0201 09:55:28.235873 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:55:28 crc kubenswrapper[5127]: E0201 09:55:28.236635 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:55:43 crc kubenswrapper[5127]: I0201 09:55:43.236599 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:55:43 crc kubenswrapper[5127]: E0201 09:55:43.237547 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:55:55 crc kubenswrapper[5127]: I0201 09:55:55.235571 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:55:55 crc kubenswrapper[5127]: E0201 09:55:55.236265 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:56:07 crc kubenswrapper[5127]: I0201 09:56:07.236518 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:56:07 crc kubenswrapper[5127]: E0201 09:56:07.237217 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:56:20 crc kubenswrapper[5127]: I0201 09:56:20.249436 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:56:20 crc kubenswrapper[5127]: E0201 09:56:20.250504 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:56:35 crc kubenswrapper[5127]: I0201 09:56:35.236686 5127 scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:56:35 crc kubenswrapper[5127]: E0201 09:56:35.237336 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 09:56:46 crc kubenswrapper[5127]: I0201 09:56:46.235556 5127 
scope.go:117] "RemoveContainer" containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 09:56:46 crc kubenswrapper[5127]: I0201 09:56:46.946260 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75"} Feb 01 09:59:06 crc kubenswrapper[5127]: I0201 09:59:06.741381 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:59:06 crc kubenswrapper[5127]: I0201 09:59:06.741911 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 09:59:36 crc kubenswrapper[5127]: I0201 09:59:36.740758 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 09:59:36 crc kubenswrapper[5127]: I0201 09:59:36.742278 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.215475 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m"] Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216373 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216385 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216415 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="extract-content" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216421 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="extract-content" Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216434 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216440 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216449 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="extract-content" Feb 01 10:00:00 crc 
kubenswrapper[5127]: I0201 10:00:00.216455 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="extract-content" Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216471 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="extract-utilities" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216477 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="extract-utilities" Feb 01 10:00:00 crc kubenswrapper[5127]: E0201 10:00:00.216494 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="extract-utilities" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216500 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="extract-utilities" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216685 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5adea3-b70d-4465-b632-38556dcde84f" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.216697 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="76409d94-1412-40dd-af10-1aa6e8bda465" containerName="registry-server" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.217383 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.220752 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.225631 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.277942 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m"] Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.305507 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2whn\" (UniqueName: \"kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.305556 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.305691 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc 
kubenswrapper[5127]: I0201 10:00:00.406508 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.406660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2whn\" (UniqueName: \"kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.406711 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.407418 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.413431 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.423651 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2whn\" (UniqueName: \"kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn\") pod \"collect-profiles-29499000-lxn6m\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:00 crc kubenswrapper[5127]: I0201 10:00:00.535451 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:01 crc kubenswrapper[5127]: I0201 10:00:01.399906 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m"] Feb 01 10:00:01 crc kubenswrapper[5127]: W0201 10:00:01.404301 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22250279_6ffa_4050_847a_5fc6d2316b3c.slice/crio-59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416 WatchSource:0}: Error finding container 59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416: Status 404 returned error can't find the container with id 59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416 Feb 01 10:00:02 crc kubenswrapper[5127]: I0201 10:00:02.272743 5127 generic.go:334] "Generic (PLEG): container finished" podID="22250279-6ffa-4050-847a-5fc6d2316b3c" containerID="ce3e2535d9ccc9c914217cd2ff5644277b78ba5f557645cc6a136acda148a4c6" exitCode=0 Feb 01 10:00:02 crc kubenswrapper[5127]: I0201 10:00:02.272810 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" event={"ID":"22250279-6ffa-4050-847a-5fc6d2316b3c","Type":"ContainerDied","Data":"ce3e2535d9ccc9c914217cd2ff5644277b78ba5f557645cc6a136acda148a4c6"} Feb 01 10:00:02 crc kubenswrapper[5127]: I0201 10:00:02.273194 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" event={"ID":"22250279-6ffa-4050-847a-5fc6d2316b3c","Type":"ContainerStarted","Data":"59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416"} Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.010339 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.182470 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2whn\" (UniqueName: \"kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn\") pod \"22250279-6ffa-4050-847a-5fc6d2316b3c\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.182612 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume\") pod \"22250279-6ffa-4050-847a-5fc6d2316b3c\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.182688 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume\") pod \"22250279-6ffa-4050-847a-5fc6d2316b3c\" (UID: \"22250279-6ffa-4050-847a-5fc6d2316b3c\") " Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.183672 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume" (OuterVolumeSpecName: "config-volume") pod "22250279-6ffa-4050-847a-5fc6d2316b3c" (UID: "22250279-6ffa-4050-847a-5fc6d2316b3c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.200972 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn" (OuterVolumeSpecName: "kube-api-access-h2whn") pod "22250279-6ffa-4050-847a-5fc6d2316b3c" (UID: "22250279-6ffa-4050-847a-5fc6d2316b3c"). InnerVolumeSpecName "kube-api-access-h2whn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.205682 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "22250279-6ffa-4050-847a-5fc6d2316b3c" (UID: "22250279-6ffa-4050-847a-5fc6d2316b3c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.284706 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2whn\" (UniqueName: \"kubernetes.io/projected/22250279-6ffa-4050-847a-5fc6d2316b3c-kube-api-access-h2whn\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.284950 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22250279-6ffa-4050-847a-5fc6d2316b3c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.284959 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/22250279-6ffa-4050-847a-5fc6d2316b3c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.299316 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" event={"ID":"22250279-6ffa-4050-847a-5fc6d2316b3c","Type":"ContainerDied","Data":"59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416"} Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.299391 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59a66b0dce4c2f49013f47182f71460c6b84a7872c3ed0985822f421589e2416" Feb 01 10:00:04 crc kubenswrapper[5127]: I0201 10:00:04.299454 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499000-lxn6m" Feb 01 10:00:05 crc kubenswrapper[5127]: I0201 10:00:05.118220 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"] Feb 01 10:00:05 crc kubenswrapper[5127]: I0201 10:00:05.130902 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498955-g8qbw"] Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.248240 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c9ea12-9b4a-4c88-b0a1-810b48999166" path="/var/lib/kubelet/pods/02c9ea12-9b4a-4c88-b0a1-810b48999166/volumes" Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.741955 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.742218 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.742253 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.742687 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 10:00:06 crc kubenswrapper[5127]: I0201 10:00:06.742733 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75" gracePeriod=600 Feb 01 10:00:07 crc kubenswrapper[5127]: I0201 10:00:07.345745 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75" exitCode=0 Feb 01 10:00:07 crc kubenswrapper[5127]: I0201 10:00:07.345842 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75"} Feb 01 10:00:07 crc kubenswrapper[5127]: I0201 10:00:07.346093 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be"} Feb 01 10:00:07 crc kubenswrapper[5127]: I0201 10:00:07.346133 5127 scope.go:117] "RemoveContainer" 
containerID="a3911cfeb27aecf1668553f126293acd6bc76856c3134f7d8f5ce001d423a8c6" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.654307 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:41 crc kubenswrapper[5127]: E0201 10:00:41.656713 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22250279-6ffa-4050-847a-5fc6d2316b3c" containerName="collect-profiles" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.656760 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="22250279-6ffa-4050-847a-5fc6d2316b3c" containerName="collect-profiles" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.657278 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="22250279-6ffa-4050-847a-5fc6d2316b3c" containerName="collect-profiles" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.674499 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.696669 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.780612 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.781039 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfsxn\" (UniqueName: \"kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.781217 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.883323 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.883414 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfsxn\" (UniqueName: \"kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.883487 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities\") pod \"certified-operators-tf6x5\" 
(UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.883942 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.884015 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:41 crc kubenswrapper[5127]: I0201 10:00:41.911735 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfsxn\" (UniqueName: \"kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn\") pod \"certified-operators-tf6x5\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:42 crc kubenswrapper[5127]: I0201 10:00:42.008447 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:42 crc kubenswrapper[5127]: I0201 10:00:42.535847 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:42 crc kubenswrapper[5127]: I0201 10:00:42.701683 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerStarted","Data":"620c90f8ac8e2151ba826cb72410863d14eab91919dbe654bb052d06c6dbf0bc"} Feb 01 10:00:43 crc kubenswrapper[5127]: I0201 10:00:43.714486 5127 generic.go:334] "Generic (PLEG): container finished" podID="11ee7668-319f-4222-b4e7-b39390c136a4" containerID="39dc18e5c47319f095d9fbfa97b01b93a1ea952ad3cab2c887720d97d396de6d" exitCode=0 Feb 01 10:00:43 crc kubenswrapper[5127]: I0201 10:00:43.714730 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerDied","Data":"39dc18e5c47319f095d9fbfa97b01b93a1ea952ad3cab2c887720d97d396de6d"} Feb 01 10:00:43 crc kubenswrapper[5127]: I0201 10:00:43.716976 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 10:00:44 crc kubenswrapper[5127]: I0201 10:00:44.726985 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerStarted","Data":"607fd6ef35a6583317409ffe0eb9244feec879d8e105b302dcf5dd214129b234"} Feb 01 10:00:46 crc kubenswrapper[5127]: E0201 10:00:46.208205 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ee7668_319f_4222_b4e7_b39390c136a4.slice/crio-607fd6ef35a6583317409ffe0eb9244feec879d8e105b302dcf5dd214129b234.scope\": RecentStats: unable to find data in memory cache]" Feb 01 10:00:46 crc kubenswrapper[5127]: I0201 10:00:46.750848 5127 generic.go:334] "Generic (PLEG): container 
finished" podID="11ee7668-319f-4222-b4e7-b39390c136a4" containerID="607fd6ef35a6583317409ffe0eb9244feec879d8e105b302dcf5dd214129b234" exitCode=0 Feb 01 10:00:46 crc kubenswrapper[5127]: I0201 10:00:46.750896 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerDied","Data":"607fd6ef35a6583317409ffe0eb9244feec879d8e105b302dcf5dd214129b234"} Feb 01 10:00:47 crc kubenswrapper[5127]: I0201 10:00:47.764040 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerStarted","Data":"8f1eb6993b9601dcd7af70eab682241a4ffc4ddecd3719e7edfd0fd67c9830f4"} Feb 01 10:00:47 crc kubenswrapper[5127]: I0201 10:00:47.779885 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tf6x5" podStartSLOduration=3.353080723 podStartE2EDuration="6.77986743s" podCreationTimestamp="2026-02-01 10:00:41 +0000 UTC" firstStartedPulling="2026-02-01 10:00:43.716744343 +0000 UTC m=+11594.202646706" lastFinishedPulling="2026-02-01 10:00:47.14353105 +0000 UTC m=+11597.629433413" observedRunningTime="2026-02-01 10:00:47.77872855 +0000 UTC m=+11598.264630973" watchObservedRunningTime="2026-02-01 10:00:47.77986743 +0000 UTC m=+11598.265769793" Feb 01 10:00:51 crc kubenswrapper[5127]: I0201 10:00:51.647021 5127 scope.go:117] "RemoveContainer" containerID="50e30ed260e77de219c101046127e2ac6a35c03f8e019ed982eaf2f1599714c8" Feb 01 10:00:52 crc kubenswrapper[5127]: I0201 10:00:52.008799 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:52 crc kubenswrapper[5127]: I0201 10:00:52.009184 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:52 crc kubenswrapper[5127]: I0201 10:00:52.072168 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:52 crc kubenswrapper[5127]: I0201 10:00:52.884088 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:52 crc kubenswrapper[5127]: I0201 10:00:52.949907 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:54 crc kubenswrapper[5127]: I0201 10:00:54.832633 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tf6x5" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="registry-server" containerID="cri-o://8f1eb6993b9601dcd7af70eab682241a4ffc4ddecd3719e7edfd0fd67c9830f4" gracePeriod=2 Feb 01 10:00:55 crc kubenswrapper[5127]: I0201 10:00:55.854616 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerDied","Data":"8f1eb6993b9601dcd7af70eab682241a4ffc4ddecd3719e7edfd0fd67c9830f4"} Feb 01 10:00:55 crc kubenswrapper[5127]: I0201 10:00:55.854652 5127 generic.go:334] "Generic (PLEG): container finished" podID="11ee7668-319f-4222-b4e7-b39390c136a4" containerID="8f1eb6993b9601dcd7af70eab682241a4ffc4ddecd3719e7edfd0fd67c9830f4" exitCode=0 Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.093390 5127 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.289904 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities\") pod \"11ee7668-319f-4222-b4e7-b39390c136a4\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.290377 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content\") pod \"11ee7668-319f-4222-b4e7-b39390c136a4\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.290407 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfsxn\" (UniqueName: \"kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn\") pod \"11ee7668-319f-4222-b4e7-b39390c136a4\" (UID: \"11ee7668-319f-4222-b4e7-b39390c136a4\") " Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.291219 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities" (OuterVolumeSpecName: "utilities") pod "11ee7668-319f-4222-b4e7-b39390c136a4" (UID: "11ee7668-319f-4222-b4e7-b39390c136a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.291311 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.299714 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn" (OuterVolumeSpecName: "kube-api-access-tfsxn") pod "11ee7668-319f-4222-b4e7-b39390c136a4" (UID: "11ee7668-319f-4222-b4e7-b39390c136a4"). InnerVolumeSpecName "kube-api-access-tfsxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.357726 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11ee7668-319f-4222-b4e7-b39390c136a4" (UID: "11ee7668-319f-4222-b4e7-b39390c136a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.393956 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ee7668-319f-4222-b4e7-b39390c136a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.394007 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfsxn\" (UniqueName: \"kubernetes.io/projected/11ee7668-319f-4222-b4e7-b39390c136a4-kube-api-access-tfsxn\") on node \"crc\" DevicePath \"\"" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.869606 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tf6x5" event={"ID":"11ee7668-319f-4222-b4e7-b39390c136a4","Type":"ContainerDied","Data":"620c90f8ac8e2151ba826cb72410863d14eab91919dbe654bb052d06c6dbf0bc"} Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.869954 5127 scope.go:117] "RemoveContainer" containerID="8f1eb6993b9601dcd7af70eab682241a4ffc4ddecd3719e7edfd0fd67c9830f4" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.869652 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tf6x5" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.913928 5127 scope.go:117] "RemoveContainer" containerID="607fd6ef35a6583317409ffe0eb9244feec879d8e105b302dcf5dd214129b234" Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.917077 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.933237 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tf6x5"] Feb 01 10:00:56 crc kubenswrapper[5127]: I0201 10:00:56.968991 5127 scope.go:117] "RemoveContainer" containerID="39dc18e5c47319f095d9fbfa97b01b93a1ea952ad3cab2c887720d97d396de6d" Feb 01 10:00:58 crc kubenswrapper[5127]: I0201 10:00:58.249690 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" path="/var/lib/kubelet/pods/11ee7668-319f-4222-b4e7-b39390c136a4/volumes" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.158983 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29499001-jzqw2"] Feb 01 10:01:00 crc kubenswrapper[5127]: E0201 10:01:00.159701 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="extract-content" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.159713 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="extract-content" Feb 01 10:01:00 crc kubenswrapper[5127]: E0201 10:01:00.159749 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="extract-utilities" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.159756 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="extract-utilities" Feb 01 10:01:00 crc kubenswrapper[5127]: E0201 10:01:00.159775 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="registry-server" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.159781 5127 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="registry-server" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.159950 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ee7668-319f-4222-b4e7-b39390c136a4" containerName="registry-server" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.160659 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.169952 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvc2\" (UniqueName: \"kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.170038 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.170091 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.170329 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.172272 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29499001-jzqw2"] Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.273171 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvc2\" (UniqueName: \"kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.273447 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.273483 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.273660 5127 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.281896 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.284148 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.287502 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.292363 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvc2\" (UniqueName: \"kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2\") pod \"keystone-cron-29499001-jzqw2\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.482985 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.745210 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.747494 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.757872 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.884976 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.885187 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.885209 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tl9b\" (UniqueName: \"kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.960049 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29499001-jzqw2"] Feb 01 10:01:00 crc kubenswrapper[5127]: W0201 10:01:00.968699 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff2f742_9eb6_450f_af81_ca38b3654694.slice/crio-fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9 WatchSource:0}: Error finding container fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9: Status 404 returned error can't find the container with id fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9 Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.987055 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.987194 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.987219 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tl9b\" (UniqueName: \"kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.987565 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:00 crc kubenswrapper[5127]: I0201 10:01:00.987828 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.003429 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tl9b\" (UniqueName: \"kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b\") pod \"community-operators-9fllb\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.066280 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.614862 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.940005 5127 generic.go:334] "Generic (PLEG): container finished" podID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerID="ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f" exitCode=0 Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.940229 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerDied","Data":"ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f"} Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.940418 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerStarted","Data":"43bce1d02ac8c3eebd702d290738b8a1e546a06479e130addcf9372434853fe9"} Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.942270 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29499001-jzqw2" event={"ID":"2ff2f742-9eb6-450f-af81-ca38b3654694","Type":"ContainerStarted","Data":"85e70b9bdafc4fc0ddf1dd574e1d6f9af1a7d07e4a53c737961669a1e416b078"} Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.942310 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29499001-jzqw2" event={"ID":"2ff2f742-9eb6-450f-af81-ca38b3654694","Type":"ContainerStarted","Data":"fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9"} Feb 01 10:01:01 crc kubenswrapper[5127]: I0201 10:01:01.996316 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29499001-jzqw2" podStartSLOduration=1.996293937 podStartE2EDuration="1.996293937s" podCreationTimestamp="2026-02-01 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 10:01:01.979438004 +0000 UTC m=+11612.465340367" watchObservedRunningTime="2026-02-01 10:01:01.996293937 +0000 UTC m=+11612.482196300" Feb 01 10:01:03 crc kubenswrapper[5127]: I0201 10:01:03.950147 5127 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:03 crc kubenswrapper[5127]: I0201 10:01:03.953621 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:03 crc kubenswrapper[5127]: I0201 10:01:03.968899 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:03 crc kubenswrapper[5127]: I0201 10:01:03.971262 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerStarted","Data":"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf"} Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.151774 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.151832 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.152215 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxn4h\" (UniqueName: \"kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.254270 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxn4h\" (UniqueName: \"kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.254660 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.254819 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.255190 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " 
pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.255441 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.274495 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxn4h\" (UniqueName: \"kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h\") pod \"redhat-operators-59q9q\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.284285 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.800419 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.982237 5127 generic.go:334] "Generic (PLEG): container finished" podID="2ff2f742-9eb6-450f-af81-ca38b3654694" containerID="85e70b9bdafc4fc0ddf1dd574e1d6f9af1a7d07e4a53c737961669a1e416b078" exitCode=0 Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.982307 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29499001-jzqw2" event={"ID":"2ff2f742-9eb6-450f-af81-ca38b3654694","Type":"ContainerDied","Data":"85e70b9bdafc4fc0ddf1dd574e1d6f9af1a7d07e4a53c737961669a1e416b078"} Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.984013 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerStarted","Data":"e23691033b283d18600da5dc48207433fc07bf16349e4269fcc4207f001dfe24"} Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.987451 5127 generic.go:334] "Generic (PLEG): container finished" podID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerID="e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf" exitCode=0 Feb 01 10:01:04 crc kubenswrapper[5127]: I0201 10:01:04.987503 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerDied","Data":"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf"} Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.001948 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerStarted","Data":"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93"} Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.009075 5127 generic.go:334] "Generic (PLEG): container finished" podID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerID="640dfca0d81f8606fd8bee40e4435afe34e9f4af7926901c3572b9818f2948ab" exitCode=0 Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.009136 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerDied","Data":"640dfca0d81f8606fd8bee40e4435afe34e9f4af7926901c3572b9818f2948ab"} Feb 01 
10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.036091 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9fllb" podStartSLOduration=2.60147704 podStartE2EDuration="6.036069176s" podCreationTimestamp="2026-02-01 10:01:00 +0000 UTC" firstStartedPulling="2026-02-01 10:01:01.942798598 +0000 UTC m=+11612.428700971" lastFinishedPulling="2026-02-01 10:01:05.377390754 +0000 UTC m=+11615.863293107" observedRunningTime="2026-02-01 10:01:06.025988495 +0000 UTC m=+11616.511890858" watchObservedRunningTime="2026-02-01 10:01:06.036069176 +0000 UTC m=+11616.521971549" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.679236 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.810894 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys\") pod \"2ff2f742-9eb6-450f-af81-ca38b3654694\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.811087 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle\") pod \"2ff2f742-9eb6-450f-af81-ca38b3654694\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.811244 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwvc2\" (UniqueName: \"kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2\") pod \"2ff2f742-9eb6-450f-af81-ca38b3654694\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.811269 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data\") pod \"2ff2f742-9eb6-450f-af81-ca38b3654694\" (UID: \"2ff2f742-9eb6-450f-af81-ca38b3654694\") " Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.816730 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2" (OuterVolumeSpecName: "kube-api-access-lwvc2") pod "2ff2f742-9eb6-450f-af81-ca38b3654694" (UID: "2ff2f742-9eb6-450f-af81-ca38b3654694"). InnerVolumeSpecName "kube-api-access-lwvc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.825347 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2ff2f742-9eb6-450f-af81-ca38b3654694" (UID: "2ff2f742-9eb6-450f-af81-ca38b3654694"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.841808 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff2f742-9eb6-450f-af81-ca38b3654694" (UID: "2ff2f742-9eb6-450f-af81-ca38b3654694"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.864831 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data" (OuterVolumeSpecName: "config-data") pod "2ff2f742-9eb6-450f-af81-ca38b3654694" (UID: "2ff2f742-9eb6-450f-af81-ca38b3654694"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.913297 5127 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.913641 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.913653 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwvc2\" (UniqueName: \"kubernetes.io/projected/2ff2f742-9eb6-450f-af81-ca38b3654694-kube-api-access-lwvc2\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:06 crc kubenswrapper[5127]: I0201 10:01:06.913663 5127 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ff2f742-9eb6-450f-af81-ca38b3654694-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:07 crc kubenswrapper[5127]: I0201 10:01:07.043951 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29499001-jzqw2" Feb 01 10:01:07 crc kubenswrapper[5127]: I0201 10:01:07.045881 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29499001-jzqw2" event={"ID":"2ff2f742-9eb6-450f-af81-ca38b3654694","Type":"ContainerDied","Data":"fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9"} Feb 01 10:01:07 crc kubenswrapper[5127]: I0201 10:01:07.045930 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcbcce2fd51700716da6d41c86de1e51ecb0e0912b9038e4f4f4408d7ef533c9" Feb 01 10:01:08 crc kubenswrapper[5127]: I0201 10:01:08.057924 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerStarted","Data":"2472f9c753216ac814aad492288bd97235d16f9bc4597dc86fdc1ba5edf86780"} Feb 01 10:01:11 crc kubenswrapper[5127]: I0201 10:01:11.067327 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:11 crc kubenswrapper[5127]: I0201 10:01:11.067661 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:12 crc kubenswrapper[5127]: I0201 10:01:12.109803 5127 generic.go:334] "Generic (PLEG): container finished" podID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerID="2472f9c753216ac814aad492288bd97235d16f9bc4597dc86fdc1ba5edf86780" exitCode=0 Feb 01 10:01:12 crc kubenswrapper[5127]: I0201 10:01:12.109853 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerDied","Data":"2472f9c753216ac814aad492288bd97235d16f9bc4597dc86fdc1ba5edf86780"} Feb 01 
10:01:12 crc kubenswrapper[5127]: I0201 10:01:12.122858 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9fllb" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="registry-server" probeResult="failure" output=< Feb 01 10:01:12 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 10:01:12 crc kubenswrapper[5127]: > Feb 01 10:01:13 crc kubenswrapper[5127]: I0201 10:01:13.126001 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerStarted","Data":"7e641fa3263af64675546b0ea4fa6bf9da8d2585376bcd591218b84c9a8cb993"} Feb 01 10:01:13 crc kubenswrapper[5127]: I0201 10:01:13.154382 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-59q9q" podStartSLOduration=3.619143863 podStartE2EDuration="10.154355051s" podCreationTimestamp="2026-02-01 10:01:03 +0000 UTC" firstStartedPulling="2026-02-01 10:01:06.010834207 +0000 UTC m=+11616.496736570" lastFinishedPulling="2026-02-01 10:01:12.546045385 +0000 UTC m=+11623.031947758" observedRunningTime="2026-02-01 10:01:13.14317166 +0000 UTC m=+11623.629074023" watchObservedRunningTime="2026-02-01 10:01:13.154355051 +0000 UTC m=+11623.640257444" Feb 01 10:01:14 crc kubenswrapper[5127]: I0201 10:01:14.285854 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:14 crc kubenswrapper[5127]: I0201 10:01:14.285899 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:15 crc kubenswrapper[5127]: I0201 10:01:15.347634 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-59q9q" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" probeResult="failure" output=< Feb 01 10:01:15 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 10:01:15 crc kubenswrapper[5127]: > Feb 01 10:01:21 crc kubenswrapper[5127]: I0201 10:01:21.117865 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:21 crc kubenswrapper[5127]: I0201 10:01:21.196933 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:21 crc kubenswrapper[5127]: I0201 10:01:21.355320 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:22 crc kubenswrapper[5127]: I0201 10:01:22.209669 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9fllb" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="registry-server" containerID="cri-o://5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93" gracePeriod=2 Feb 01 10:01:22 crc kubenswrapper[5127]: I0201 10:01:22.977162 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.136872 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities\") pod \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.136947 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content\") pod \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.137102 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tl9b\" (UniqueName: \"kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b\") pod \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\" (UID: \"75e27a5f-ad0b-46ff-9856-02dc095ed3e6\") " Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.137690 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities" (OuterVolumeSpecName: "utilities") pod "75e27a5f-ad0b-46ff-9856-02dc095ed3e6" (UID: "75e27a5f-ad0b-46ff-9856-02dc095ed3e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.143343 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b" (OuterVolumeSpecName: "kube-api-access-8tl9b") pod "75e27a5f-ad0b-46ff-9856-02dc095ed3e6" (UID: "75e27a5f-ad0b-46ff-9856-02dc095ed3e6"). InnerVolumeSpecName "kube-api-access-8tl9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.182131 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75e27a5f-ad0b-46ff-9856-02dc095ed3e6" (UID: "75e27a5f-ad0b-46ff-9856-02dc095ed3e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.219373 5127 generic.go:334] "Generic (PLEG): container finished" podID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerID="5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93" exitCode=0 Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.219415 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerDied","Data":"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93"} Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.219438 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9fllb" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.219457 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9fllb" event={"ID":"75e27a5f-ad0b-46ff-9856-02dc095ed3e6","Type":"ContainerDied","Data":"43bce1d02ac8c3eebd702d290738b8a1e546a06479e130addcf9372434853fe9"} Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.219476 5127 scope.go:117] "RemoveContainer" containerID="5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.239273 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tl9b\" (UniqueName: \"kubernetes.io/projected/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-kube-api-access-8tl9b\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.239600 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.239616 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e27a5f-ad0b-46ff-9856-02dc095ed3e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.244217 5127 scope.go:117] "RemoveContainer" containerID="e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.257918 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.273180 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9fllb"] Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.280362 5127 scope.go:117] "RemoveContainer" containerID="ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.317690 5127 scope.go:117] "RemoveContainer" containerID="5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93" Feb 01 10:01:23 crc kubenswrapper[5127]: E0201 10:01:23.318187 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93\": container with ID starting with 5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93 not found: ID does not exist" containerID="5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.318226 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93"} err="failed to get container status \"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93\": rpc error: code = NotFound desc = could not find container \"5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93\": container with ID starting with 5187ab8e0f6c2ff23bf1a18d7eecebcf8140ad8a6f029c76e4050fc84109ca93 not found: ID does not exist" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.318254 5127 scope.go:117] "RemoveContainer" containerID="e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf" Feb 01 10:01:23 crc kubenswrapper[5127]: 
E0201 10:01:23.318548 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf\": container with ID starting with e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf not found: ID does not exist" containerID="e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.318598 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf"} err="failed to get container status \"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf\": rpc error: code = NotFound desc = could not find container \"e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf\": container with ID starting with e68c5f2960e52a4cd67bdbeacb1e2666b79a6283788ad4f6d9f9bca86d6c9daf not found: ID does not exist" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.318619 5127 scope.go:117] "RemoveContainer" containerID="ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f" Feb 01 10:01:23 crc kubenswrapper[5127]: E0201 10:01:23.318951 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f\": container with ID starting with ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f not found: ID does not exist" containerID="ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f" Feb 01 10:01:23 crc kubenswrapper[5127]: I0201 10:01:23.318996 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f"} err="failed to get container status \"ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f\": rpc error: code = NotFound desc = could not find container \"ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f\": container with ID starting with ec957bd21dd273887a4e4805a61a5a4a75fb5bbc27e6807a7c3b425f02e24c3f not found: ID does not exist" Feb 01 10:01:24 crc kubenswrapper[5127]: I0201 10:01:24.256335 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" path="/var/lib/kubelet/pods/75e27a5f-ad0b-46ff-9856-02dc095ed3e6/volumes" Feb 01 10:01:25 crc kubenswrapper[5127]: I0201 10:01:25.357864 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-59q9q" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" probeResult="failure" output=< Feb 01 10:01:25 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s Feb 01 10:01:25 crc kubenswrapper[5127]: > Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.572043 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:27 crc kubenswrapper[5127]: E0201 10:01:27.572916 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="extract-content" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.572929 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="extract-content" Feb 01 10:01:27 crc kubenswrapper[5127]: E0201 10:01:27.572948 5127 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff2f742-9eb6-450f-af81-ca38b3654694" containerName="keystone-cron" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.572955 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff2f742-9eb6-450f-af81-ca38b3654694" containerName="keystone-cron" Feb 01 10:01:27 crc kubenswrapper[5127]: E0201 10:01:27.572982 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="registry-server" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.572988 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="registry-server" Feb 01 10:01:27 crc kubenswrapper[5127]: E0201 10:01:27.573003 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="extract-utilities" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.573009 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="extract-utilities" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.573179 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff2f742-9eb6-450f-af81-ca38b3654694" containerName="keystone-cron" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.573204 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e27a5f-ad0b-46ff-9856-02dc095ed3e6" containerName="registry-server" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.574609 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.588206 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.740824 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.740951 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtczs\" (UniqueName: \"kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.741096 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.843523 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc 
kubenswrapper[5127]: I0201 10:01:27.843678 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.843712 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtczs\" (UniqueName: \"kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.844174 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.844362 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.865949 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtczs\" (UniqueName: \"kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs\") pod \"redhat-marketplace-jjbs4\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:27 crc kubenswrapper[5127]: I0201 10:01:27.894866 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:28 crc kubenswrapper[5127]: I0201 10:01:28.434783 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:29 crc kubenswrapper[5127]: I0201 10:01:29.303140 5127 generic.go:334] "Generic (PLEG): container finished" podID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerID="dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7" exitCode=0 Feb 01 10:01:29 crc kubenswrapper[5127]: I0201 10:01:29.303219 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerDied","Data":"dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7"} Feb 01 10:01:29 crc kubenswrapper[5127]: I0201 10:01:29.303471 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerStarted","Data":"124411c48d9a10c7524216042a11055d2269e201e787c898514e49b7517f75a0"} Feb 01 10:01:31 crc kubenswrapper[5127]: I0201 10:01:31.331121 5127 generic.go:334] "Generic (PLEG): container finished" podID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerID="8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7" exitCode=0 Feb 01 10:01:31 crc kubenswrapper[5127]: I0201 10:01:31.331316 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerDied","Data":"8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7"} Feb 01 10:01:32 crc kubenswrapper[5127]: I0201 10:01:32.345366 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerStarted","Data":"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd"} Feb 01 10:01:32 crc kubenswrapper[5127]: I0201 10:01:32.369630 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jjbs4" podStartSLOduration=2.950349412 podStartE2EDuration="5.369610221s" podCreationTimestamp="2026-02-01 10:01:27 +0000 UTC" firstStartedPulling="2026-02-01 10:01:29.305489503 +0000 UTC m=+11639.791391866" lastFinishedPulling="2026-02-01 10:01:31.724750292 +0000 UTC m=+11642.210652675" observedRunningTime="2026-02-01 10:01:32.363037885 +0000 UTC m=+11642.848940268" watchObservedRunningTime="2026-02-01 10:01:32.369610221 +0000 UTC m=+11642.855512584" Feb 01 10:01:34 crc kubenswrapper[5127]: I0201 10:01:34.334310 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:34 crc kubenswrapper[5127]: I0201 10:01:34.383060 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:35 crc kubenswrapper[5127]: I0201 10:01:35.755654 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:35 crc kubenswrapper[5127]: I0201 10:01:35.756154 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-59q9q" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" 
containerID="cri-o://7e641fa3263af64675546b0ea4fa6bf9da8d2585376bcd591218b84c9a8cb993" gracePeriod=2 Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.381535 5127 generic.go:334] "Generic (PLEG): container finished" podID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerID="7e641fa3263af64675546b0ea4fa6bf9da8d2585376bcd591218b84c9a8cb993" exitCode=0 Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.381943 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerDied","Data":"7e641fa3263af64675546b0ea4fa6bf9da8d2585376bcd591218b84c9a8cb993"} Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.493068 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.642621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content\") pod \"32cfa615-c302-4cdd-acfd-80c9fa516190\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.642676 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxn4h\" (UniqueName: \"kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h\") pod \"32cfa615-c302-4cdd-acfd-80c9fa516190\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.642811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities\") pod \"32cfa615-c302-4cdd-acfd-80c9fa516190\" (UID: \"32cfa615-c302-4cdd-acfd-80c9fa516190\") " Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.643973 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities" (OuterVolumeSpecName: "utilities") pod "32cfa615-c302-4cdd-acfd-80c9fa516190" (UID: "32cfa615-c302-4cdd-acfd-80c9fa516190"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.648986 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h" (OuterVolumeSpecName: "kube-api-access-hxn4h") pod "32cfa615-c302-4cdd-acfd-80c9fa516190" (UID: "32cfa615-c302-4cdd-acfd-80c9fa516190"). InnerVolumeSpecName "kube-api-access-hxn4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.745169 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxn4h\" (UniqueName: \"kubernetes.io/projected/32cfa615-c302-4cdd-acfd-80c9fa516190-kube-api-access-hxn4h\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.745495 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.760820 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32cfa615-c302-4cdd-acfd-80c9fa516190" (UID: "32cfa615-c302-4cdd-acfd-80c9fa516190"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:36 crc kubenswrapper[5127]: I0201 10:01:36.847368 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32cfa615-c302-4cdd-acfd-80c9fa516190-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.399711 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59q9q" event={"ID":"32cfa615-c302-4cdd-acfd-80c9fa516190","Type":"ContainerDied","Data":"e23691033b283d18600da5dc48207433fc07bf16349e4269fcc4207f001dfe24"} Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.399808 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59q9q" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.400116 5127 scope.go:117] "RemoveContainer" containerID="7e641fa3263af64675546b0ea4fa6bf9da8d2585376bcd591218b84c9a8cb993" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.443795 5127 scope.go:117] "RemoveContainer" containerID="2472f9c753216ac814aad492288bd97235d16f9bc4597dc86fdc1ba5edf86780" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.468537 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.485644 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-59q9q"] Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.504641 5127 scope.go:117] "RemoveContainer" containerID="640dfca0d81f8606fd8bee40e4435afe34e9f4af7926901c3572b9818f2948ab" Feb 01 10:01:37 crc kubenswrapper[5127]: E0201 10:01:37.600947 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32cfa615_c302_4cdd_acfd_80c9fa516190.slice/crio-e23691033b283d18600da5dc48207433fc07bf16349e4269fcc4207f001dfe24\": RecentStats: unable to find data in memory cache]" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.896108 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.896408 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:37 crc kubenswrapper[5127]: I0201 10:01:37.961175 5127 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:38 crc kubenswrapper[5127]: I0201 10:01:38.245411 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" path="/var/lib/kubelet/pods/32cfa615-c302-4cdd-acfd-80c9fa516190/volumes" Feb 01 10:01:38 crc kubenswrapper[5127]: I0201 10:01:38.500888 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:39 crc kubenswrapper[5127]: I0201 10:01:39.954569 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:40 crc kubenswrapper[5127]: I0201 10:01:40.433411 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jjbs4" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="registry-server" containerID="cri-o://6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd" gracePeriod=2 Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.171472 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.242056 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities\") pod \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.242262 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content\") pod \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.242426 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtczs\" (UniqueName: \"kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs\") pod \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\" (UID: \"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637\") " Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.247493 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities" (OuterVolumeSpecName: "utilities") pod "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" (UID: "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.249717 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs" (OuterVolumeSpecName: "kube-api-access-xtczs") pod "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" (UID: "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637"). InnerVolumeSpecName "kube-api-access-xtczs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.270269 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" (UID: "7fa964a1-5fc0-4fe0-bf9d-29290d8a2637"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.346084 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.346115 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.346126 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtczs\" (UniqueName: \"kubernetes.io/projected/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637-kube-api-access-xtczs\") on node \"crc\" DevicePath \"\"" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.444920 5127 generic.go:334] "Generic (PLEG): container finished" podID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerID="6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd" exitCode=0 Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.445083 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jjbs4" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.445113 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerDied","Data":"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd"} Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.445523 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jjbs4" event={"ID":"7fa964a1-5fc0-4fe0-bf9d-29290d8a2637","Type":"ContainerDied","Data":"124411c48d9a10c7524216042a11055d2269e201e787c898514e49b7517f75a0"} Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.445572 5127 scope.go:117] "RemoveContainer" containerID="6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.470370 5127 scope.go:117] "RemoveContainer" containerID="8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.501623 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.504299 5127 scope.go:117] "RemoveContainer" containerID="dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.517225 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jjbs4"] Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.572662 5127 scope.go:117] "RemoveContainer" containerID="6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd" Feb 01 10:01:41 crc kubenswrapper[5127]: E0201 10:01:41.575835 5127 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd\": container with ID starting with 6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd not found: ID does not exist" containerID="6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.575885 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd"} err="failed to get container status \"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd\": rpc error: code = NotFound desc = could not find container \"6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd\": container with ID starting with 6654c9f2d97cb2eec8769d579efa4c71c23796ad3dda48123fdb323c639681dd not found: ID does not exist" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.576266 5127 scope.go:117] "RemoveContainer" containerID="8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7" Feb 01 10:01:41 crc kubenswrapper[5127]: E0201 10:01:41.578705 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7\": container with ID starting with 8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7 not found: ID does not exist" containerID="8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.578778 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7"} err="failed to get container status \"8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7\": rpc error: code = NotFound desc = could not find container \"8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7\": container with ID starting with 8ed17498d8d139677460228aef91c14e12ec32a4f66800114d67ffc335fbe3f7 not found: ID does not exist" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.578823 5127 scope.go:117] "RemoveContainer" containerID="dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7" Feb 01 10:01:41 crc kubenswrapper[5127]: E0201 10:01:41.579262 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7\": container with ID starting with dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7 not found: ID does not exist" containerID="dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7" Feb 01 10:01:41 crc kubenswrapper[5127]: I0201 10:01:41.579326 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7"} err="failed to get container status \"dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7\": rpc error: code = NotFound desc = could not find container \"dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7\": container with ID starting with dd89ac1c6b71b13043a2c15467d0f19cbda4a8362aa15983fb7e7b054c6656f7 not found: ID does not exist" Feb 01 10:01:42 crc kubenswrapper[5127]: I0201 10:01:42.247225 5127 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" path="/var/lib/kubelet/pods/7fa964a1-5fc0-4fe0-bf9d-29290d8a2637/volumes" Feb 01 10:02:36 crc kubenswrapper[5127]: I0201 10:02:36.740403 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:02:36 crc kubenswrapper[5127]: I0201 10:02:36.741093 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:03:06 crc kubenswrapper[5127]: I0201 10:03:06.740305 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:03:06 crc kubenswrapper[5127]: I0201 10:03:06.740923 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:03:36 crc kubenswrapper[5127]: I0201 10:03:36.741107 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:03:36 crc kubenswrapper[5127]: I0201 10:03:36.741643 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:03:36 crc kubenswrapper[5127]: I0201 10:03:36.741753 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 10:03:36 crc kubenswrapper[5127]: I0201 10:03:36.742797 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 10:03:36 crc kubenswrapper[5127]: I0201 10:03:36.742868 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" gracePeriod=600 Feb 01 10:03:36 crc kubenswrapper[5127]: E0201 10:03:36.901976 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:03:37 crc kubenswrapper[5127]: I0201 10:03:37.591808 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" exitCode=0 Feb 01 10:03:37 crc kubenswrapper[5127]: I0201 10:03:37.591851 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be"} Feb 01 10:03:37 crc kubenswrapper[5127]: I0201 10:03:37.592149 5127 scope.go:117] "RemoveContainer" containerID="9454e01d3fc6bb7a917a7708fe888102b07cb8cf9719add9b1516c629ea60e75" Feb 01 10:03:37 crc kubenswrapper[5127]: I0201 10:03:37.592843 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:03:37 crc kubenswrapper[5127]: E0201 10:03:37.593112 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:03:52 crc kubenswrapper[5127]: I0201 10:03:52.236187 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:03:52 crc kubenswrapper[5127]: E0201 10:03:52.237202 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:04:06 crc kubenswrapper[5127]: I0201 10:04:06.237183 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:04:06 crc kubenswrapper[5127]: E0201 10:04:06.238038 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:04:19 crc kubenswrapper[5127]: I0201 10:04:19.235491 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:04:19 crc kubenswrapper[5127]: E0201 10:04:19.236627 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:04:31 crc kubenswrapper[5127]: I0201 10:04:31.235543 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:04:31 crc kubenswrapper[5127]: E0201 10:04:31.236314 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:04:44 crc kubenswrapper[5127]: I0201 10:04:44.236307 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:04:44 crc kubenswrapper[5127]: E0201 10:04:44.237339 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:04:56 crc kubenswrapper[5127]: I0201 10:04:56.236078 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:04:56 crc kubenswrapper[5127]: E0201 10:04:56.238916 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:05:10 crc kubenswrapper[5127]: I0201 10:05:10.244966 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:05:10 crc kubenswrapper[5127]: E0201 10:05:10.245784 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:05:25 crc kubenswrapper[5127]: I0201 10:05:25.236065 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:05:25 crc kubenswrapper[5127]: E0201 10:05:25.236769 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:05:38 crc kubenswrapper[5127]: I0201 10:05:38.236921 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:05:38 crc kubenswrapper[5127]: E0201 10:05:38.237767 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:05:49 crc kubenswrapper[5127]: I0201 10:05:49.235998 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:05:49 crc kubenswrapper[5127]: E0201 10:05:49.236963 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:06:02 crc kubenswrapper[5127]: I0201 10:06:02.236762 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:06:02 crc kubenswrapper[5127]: E0201 10:06:02.237896 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:06:13 crc kubenswrapper[5127]: I0201 10:06:13.236714 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:06:13 crc kubenswrapper[5127]: E0201 10:06:13.238857 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:06:26 crc kubenswrapper[5127]: I0201 10:06:26.236540 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:06:26 crc kubenswrapper[5127]: E0201 10:06:26.238055 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:06:41 crc kubenswrapper[5127]: I0201 10:06:41.236930 5127 
scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:06:41 crc kubenswrapper[5127]: E0201 10:06:41.237952 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:06:56 crc kubenswrapper[5127]: I0201 10:06:56.236004 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:06:56 crc kubenswrapper[5127]: E0201 10:06:56.238275 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:07:04 crc kubenswrapper[5127]: I0201 10:07:04.049491 5127 generic.go:334] "Generic (PLEG): container finished" podID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" containerID="6773599dfdea5689480cd11f1c215fa9fa1892c25678ce82b49cb17238b1482a" exitCode=0 Feb 01 10:07:04 crc kubenswrapper[5127]: I0201 10:07:04.049662 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"068a067f-bb12-4a63-a3ed-7eb05da0ca52","Type":"ContainerDied","Data":"6773599dfdea5689480cd11f1c215fa9fa1892c25678ce82b49cb17238b1482a"} Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.588996 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716681 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716730 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716760 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716804 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716878 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716912 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.716945 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.717042 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.717180 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxhx\" (UniqueName: \"kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx\") pod \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\" (UID: \"068a067f-bb12-4a63-a3ed-7eb05da0ca52\") " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.720676 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.722361 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data" (OuterVolumeSpecName: "config-data") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.724165 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.733820 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.735345 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx" (OuterVolumeSpecName: "kube-api-access-kpxhx") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "kube-api-access-kpxhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.747298 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.752740 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.776904 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.786695 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "068a067f-bb12-4a63-a3ed-7eb05da0ca52" (UID: "068a067f-bb12-4a63-a3ed-7eb05da0ca52"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820379 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpxhx\" (UniqueName: \"kubernetes.io/projected/068a067f-bb12-4a63-a3ed-7eb05da0ca52-kube-api-access-kpxhx\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820414 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820428 5127 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/068a067f-bb12-4a63-a3ed-7eb05da0ca52-config-data\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820458 5127 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820471 5127 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820484 5127 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820495 5127 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/068a067f-bb12-4a63-a3ed-7eb05da0ca52-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820508 5127 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.820523 5127 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/068a067f-bb12-4a63-a3ed-7eb05da0ca52-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.848104 5127 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 01 10:07:05 crc kubenswrapper[5127]: I0201 10:07:05.922169 5127 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 01 10:07:06 crc kubenswrapper[5127]: I0201 10:07:06.074887 5127 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"068a067f-bb12-4a63-a3ed-7eb05da0ca52","Type":"ContainerDied","Data":"0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1"} Feb 01 10:07:06 crc kubenswrapper[5127]: I0201 10:07:06.075222 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c8da886155c4ebc5f1c6c78ea94397be421656a3acc578c08938135955a5fc1" Feb 01 10:07:06 crc kubenswrapper[5127]: I0201 10:07:06.074976 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.236928 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.238270 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303322 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303727 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" containerName="tempest-tests-tempest-tests-runner" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303739 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" containerName="tempest-tests-tempest-tests-runner" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303756 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="extract-utilities" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303766 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="extract-utilities" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303779 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="extract-utilities" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303785 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="extract-utilities" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303801 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="extract-content" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303809 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="extract-content" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303824 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="extract-content" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303830 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="extract-content" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303848 5127 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303853 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: E0201 10:07:09.303865 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.303870 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.304052 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="068a067f-bb12-4a63-a3ed-7eb05da0ca52" containerName="tempest-tests-tempest-tests-runner" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.304086 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fa964a1-5fc0-4fe0-bf9d-29290d8a2637" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.304093 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="32cfa615-c302-4cdd-acfd-80c9fa516190" containerName="registry-server" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.304814 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.309271 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-hxbhz" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.330360 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.499962 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.500059 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89w9f\" (UniqueName: \"kubernetes.io/projected/ead6462a-b14a-4cac-820a-67b142ddfc86-kube-api-access-89w9f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.602589 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.602674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89w9f\" (UniqueName: \"kubernetes.io/projected/ead6462a-b14a-4cac-820a-67b142ddfc86-kube-api-access-89w9f\") pod 
\"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.603660 5127 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.635621 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89w9f\" (UniqueName: \"kubernetes.io/projected/ead6462a-b14a-4cac-820a-67b142ddfc86-kube-api-access-89w9f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.644522 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"ead6462a-b14a-4cac-820a-67b142ddfc86\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:09 crc kubenswrapper[5127]: I0201 10:07:09.938206 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 01 10:07:10 crc kubenswrapper[5127]: I0201 10:07:10.491100 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 10:07:10 crc kubenswrapper[5127]: I0201 10:07:10.500619 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 01 10:07:11 crc kubenswrapper[5127]: I0201 10:07:11.142954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ead6462a-b14a-4cac-820a-67b142ddfc86","Type":"ContainerStarted","Data":"051b268ca166025336e18eff68f7e832fc55b839452a084a47d285e6c86e5d62"} Feb 01 10:07:12 crc kubenswrapper[5127]: I0201 10:07:12.159905 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"ead6462a-b14a-4cac-820a-67b142ddfc86","Type":"ContainerStarted","Data":"8896dc42099cca82a40ba54faedd08bfefe9dcb97a58a7ba3c5f13e8ac3fc291"} Feb 01 10:07:12 crc kubenswrapper[5127]: I0201 10:07:12.193503 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.988252836 podStartE2EDuration="3.193473492s" podCreationTimestamp="2026-02-01 10:07:09 +0000 UTC" firstStartedPulling="2026-02-01 10:07:10.490564736 +0000 UTC m=+11980.976467139" lastFinishedPulling="2026-02-01 10:07:11.695785402 +0000 UTC m=+11982.181687795" observedRunningTime="2026-02-01 10:07:12.176971438 +0000 UTC m=+11982.662873831" watchObservedRunningTime="2026-02-01 10:07:12.193473492 +0000 UTC m=+11982.679375885" Feb 01 10:07:22 crc kubenswrapper[5127]: I0201 10:07:22.235786 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 
10:07:22 crc kubenswrapper[5127]: E0201 10:07:22.236775 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:07:34 crc kubenswrapper[5127]: I0201 10:07:34.235992 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:07:34 crc kubenswrapper[5127]: E0201 10:07:34.236995 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:07:45 crc kubenswrapper[5127]: I0201 10:07:45.235995 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:07:45 crc kubenswrapper[5127]: E0201 10:07:45.237217 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:07:58 crc kubenswrapper[5127]: I0201 10:07:58.235960 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:07:58 crc kubenswrapper[5127]: E0201 10:07:58.236843 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:08:12 crc kubenswrapper[5127]: I0201 10:08:12.236396 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:08:12 crc kubenswrapper[5127]: E0201 10:08:12.237793 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:08:25 crc kubenswrapper[5127]: I0201 10:08:25.236150 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:08:25 crc kubenswrapper[5127]: E0201 10:08:25.236952 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.808105 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lw44g/must-gather-664vp"] Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.810390 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.812222 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lw44g"/"kube-root-ca.crt" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.812338 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lw44g"/"openshift-service-ca.crt" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.812751 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lw44g"/"default-dockercfg-cbctd" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.816147 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lw44g/must-gather-664vp"] Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.915477 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssq6s\" (UniqueName: \"kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:29 crc kubenswrapper[5127]: I0201 10:08:29.915811 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.018037 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssq6s\" (UniqueName: \"kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.025026 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.025704 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.057263 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ssq6s\" (UniqueName: \"kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s\") pod \"must-gather-664vp\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.127089 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:08:30 crc kubenswrapper[5127]: I0201 10:08:30.755632 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lw44g/must-gather-664vp"] Feb 01 10:08:31 crc kubenswrapper[5127]: I0201 10:08:31.178235 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/must-gather-664vp" event={"ID":"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520","Type":"ContainerStarted","Data":"0781ac16dc1982d60030661284c2c9b38fe6c234807d7c6fc03c944c2daabf0a"} Feb 01 10:08:35 crc kubenswrapper[5127]: I0201 10:08:35.221676 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/must-gather-664vp" event={"ID":"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520","Type":"ContainerStarted","Data":"bd5f91be99d2c3b3d4171299c5a90f0c59d4417d7d060480a10a77638904ea73"} Feb 01 10:08:36 crc kubenswrapper[5127]: I0201 10:08:36.235785 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be" Feb 01 10:08:36 crc kubenswrapper[5127]: E0201 10:08:36.236948 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:08:36 crc kubenswrapper[5127]: I0201 10:08:36.246901 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/must-gather-664vp" event={"ID":"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520","Type":"ContainerStarted","Data":"20ec62c3eb2c70ccde5e98553374fba2b49e0e484bea28a21b78dd7515089749"} Feb 01 10:08:36 crc kubenswrapper[5127]: I0201 10:08:36.264431 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lw44g/must-gather-664vp" podStartSLOduration=3.268863269 podStartE2EDuration="7.264406638s" podCreationTimestamp="2026-02-01 10:08:29 +0000 UTC" firstStartedPulling="2026-02-01 10:08:30.753242942 +0000 UTC m=+12061.239145305" lastFinishedPulling="2026-02-01 10:08:34.748786301 +0000 UTC m=+12065.234688674" observedRunningTime="2026-02-01 10:08:36.259813104 +0000 UTC m=+12066.745715507" watchObservedRunningTime="2026-02-01 10:08:36.264406638 +0000 UTC m=+12066.750309021" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.133435 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lw44g/crc-debug-wmmm5"] Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.146159 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.257354 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj9fn\" (UniqueName: \"kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.257424 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.358372 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj9fn\" (UniqueName: \"kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.358443 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.358541 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.385722 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj9fn\" (UniqueName: \"kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn\") pod \"crc-debug-wmmm5\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") " pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:08:41 crc kubenswrapper[5127]: I0201 10:08:41.468870 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-wmmm5"
Feb 01 10:08:42 crc kubenswrapper[5127]: I0201 10:08:42.309445 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" event={"ID":"8c452171-25e7-4800-bc21-1f821edd7793","Type":"ContainerStarted","Data":"143fc2d32a3c98f1e607e37f0d501987550ce5c0421f276f068e9dfe7b9bdac6"}
Feb 01 10:08:49 crc kubenswrapper[5127]: I0201 10:08:49.236568 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be"
Feb 01 10:08:51 crc kubenswrapper[5127]: I0201 10:08:51.413114 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" event={"ID":"8c452171-25e7-4800-bc21-1f821edd7793","Type":"ContainerStarted","Data":"0f00151e818c72264f8fe0fc1b3513537370c9031dc6a9bbe18578190c88d9e7"}
Feb 01 10:08:51 crc kubenswrapper[5127]: I0201 10:08:51.418040 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176"}
Feb 01 10:08:51 crc kubenswrapper[5127]: I0201 10:08:51.437336 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" podStartSLOduration=1.278182486 podStartE2EDuration="10.43730191s" podCreationTimestamp="2026-02-01 10:08:41 +0000 UTC" firstStartedPulling="2026-02-01 10:08:41.505189539 +0000 UTC m=+12071.991091912" lastFinishedPulling="2026-02-01 10:08:50.664308953 +0000 UTC m=+12081.150211336" observedRunningTime="2026-02-01 10:08:51.426869789 +0000 UTC m=+12081.912772192" watchObservedRunningTime="2026-02-01 10:08:51.43730191 +0000 UTC m=+12081.923204313"
Feb 01 10:09:29 crc kubenswrapper[5127]: I0201 10:09:29.922334 5127 generic.go:334] "Generic (PLEG): container finished" podID="8c452171-25e7-4800-bc21-1f821edd7793" containerID="0f00151e818c72264f8fe0fc1b3513537370c9031dc6a9bbe18578190c88d9e7" exitCode=0
Feb 01 10:09:29 crc kubenswrapper[5127]: I0201 10:09:29.922900 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" event={"ID":"8c452171-25e7-4800-bc21-1f821edd7793","Type":"ContainerDied","Data":"0f00151e818c72264f8fe0fc1b3513537370c9031dc6a9bbe18578190c88d9e7"}
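Two threads interleave above: the crc-debug-wmmm5 container runs to completion (exitCode=0 followed by ContainerDied), and machine-config-daemon comes back at 10:08:51 (ContainerStarted a98c5f20...), consistent with its CrashLoopBackOff window lapsing after the 10:08:49 RemoveContainer. The PLEG events carry enough to reconstruct each container's lifecycle by ID; a minimal sketch (illustrative only, container IDs truncated, regex tied to the event={...} layout above):

    import re

    # Illustrative reconstruction of container lifecycles from PLEG event entries.
    pleg = re.compile(r'"Type":"(Container\w+)","Data":"([0-9a-f]+)"')
    timeline: dict[str, list[str]] = {}
    for line in (
        'event={"ID":"8c452171","Type":"ContainerStarted","Data":"0f00151e"}',
        'event={"ID":"8c452171","Type":"ContainerDied","Data":"0f00151e"}',
    ):
        for kind, cid in pleg.findall(line):
            timeline.setdefault(cid, []).append(kind)
    print(timeline)  # {'0f00151e': ['ContainerStarted', 'ContainerDied']}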
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.076402 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-wmmm5"
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.118420 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-wmmm5"]
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.129858 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-wmmm5"]
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.164034 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host\") pod \"8c452171-25e7-4800-bc21-1f821edd7793\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") "
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.164220 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host" (OuterVolumeSpecName: "host") pod "8c452171-25e7-4800-bc21-1f821edd7793" (UID: "8c452171-25e7-4800-bc21-1f821edd7793"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.164270 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj9fn\" (UniqueName: \"kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn\") pod \"8c452171-25e7-4800-bc21-1f821edd7793\" (UID: \"8c452171-25e7-4800-bc21-1f821edd7793\") "
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.164786 5127 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c452171-25e7-4800-bc21-1f821edd7793-host\") on node \"crc\" DevicePath \"\""
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.172939 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn" (OuterVolumeSpecName: "kube-api-access-gj9fn") pod "8c452171-25e7-4800-bc21-1f821edd7793" (UID: "8c452171-25e7-4800-bc21-1f821edd7793"). InnerVolumeSpecName "kube-api-access-gj9fn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.266278 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj9fn\" (UniqueName: \"kubernetes.io/projected/8c452171-25e7-4800-bc21-1f821edd7793-kube-api-access-gj9fn\") on node \"crc\" DevicePath \"\""
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.977125 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="143fc2d32a3c98f1e607e37f0d501987550ce5c0421f276f068e9dfe7b9bdac6"
Feb 01 10:09:31 crc kubenswrapper[5127]: I0201 10:09:31.977404 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-wmmm5" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.256699 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c452171-25e7-4800-bc21-1f821edd7793" path="/var/lib/kubelet/pods/8c452171-25e7-4800-bc21-1f821edd7793/volumes" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.339193 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lw44g/crc-debug-mhcj9"] Feb 01 10:09:32 crc kubenswrapper[5127]: E0201 10:09:32.339735 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c452171-25e7-4800-bc21-1f821edd7793" containerName="container-00" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.339759 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c452171-25e7-4800-bc21-1f821edd7793" containerName="container-00" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.340048 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c452171-25e7-4800-bc21-1f821edd7793" containerName="container-00" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.340861 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.496805 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.496964 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6tb\" (UniqueName: \"kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.600658 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.600854 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.600874 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6tb\" (UniqueName: \"kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.641838 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6tb\" (UniqueName: \"kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb\") pod \"crc-debug-mhcj9\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") " 
pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:32 crc kubenswrapper[5127]: I0201 10:09:32.659622 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" Feb 01 10:09:33 crc kubenswrapper[5127]: I0201 10:09:32.993315 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" event={"ID":"444b0f34-436e-469e-9e68-73ccbecfec13","Type":"ContainerStarted","Data":"6c9a6ae1276cb9452092776526e39621155b93e615efbc432d204b5386f566b8"} Feb 01 10:09:33 crc kubenswrapper[5127]: I0201 10:09:32.993706 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" event={"ID":"444b0f34-436e-469e-9e68-73ccbecfec13","Type":"ContainerStarted","Data":"a97b272b63629ed063358ad985e44a8dee543c69ba7fb17682a9dc2dcf68ebe7"} Feb 01 10:09:33 crc kubenswrapper[5127]: I0201 10:09:33.025572 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" podStartSLOduration=1.025550178 podStartE2EDuration="1.025550178s" podCreationTimestamp="2026-02-01 10:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 10:09:33.008644833 +0000 UTC m=+12123.494547186" watchObservedRunningTime="2026-02-01 10:09:33.025550178 +0000 UTC m=+12123.511452551" Feb 01 10:09:34 crc kubenswrapper[5127]: I0201 10:09:34.021964 5127 generic.go:334] "Generic (PLEG): container finished" podID="444b0f34-436e-469e-9e68-73ccbecfec13" containerID="6c9a6ae1276cb9452092776526e39621155b93e615efbc432d204b5386f566b8" exitCode=0 Feb 01 10:09:34 crc kubenswrapper[5127]: I0201 10:09:34.022005 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-mhcj9" event={"ID":"444b0f34-436e-469e-9e68-73ccbecfec13","Type":"ContainerDied","Data":"6c9a6ae1276cb9452092776526e39621155b93e615efbc432d204b5386f566b8"} Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.195668 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-mhcj9"
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.246713 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-mhcj9"]
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.254356 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-mhcj9"]
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.369835 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs6tb\" (UniqueName: \"kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb\") pod \"444b0f34-436e-469e-9e68-73ccbecfec13\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") "
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.370026 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host\") pod \"444b0f34-436e-469e-9e68-73ccbecfec13\" (UID: \"444b0f34-436e-469e-9e68-73ccbecfec13\") "
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.370225 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host" (OuterVolumeSpecName: "host") pod "444b0f34-436e-469e-9e68-73ccbecfec13" (UID: "444b0f34-436e-469e-9e68-73ccbecfec13"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.371257 5127 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/444b0f34-436e-469e-9e68-73ccbecfec13-host\") on node \"crc\" DevicePath \"\""
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.375839 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb" (OuterVolumeSpecName: "kube-api-access-bs6tb") pod "444b0f34-436e-469e-9e68-73ccbecfec13" (UID: "444b0f34-436e-469e-9e68-73ccbecfec13"). InnerVolumeSpecName "kube-api-access-bs6tb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 10:09:35 crc kubenswrapper[5127]: I0201 10:09:35.474346 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs6tb\" (UniqueName: \"kubernetes.io/projected/444b0f34-436e-469e-9e68-73ccbecfec13-kube-api-access-bs6tb\") on node \"crc\" DevicePath \"\""
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.049653 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97b272b63629ed063358ad985e44a8dee543c69ba7fb17682a9dc2dcf68ebe7"
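The teardown above follows the reconciler's fixed per-volume order: "operationExecutor.UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached". A minimal sketch that cross-checks a saved log for unmounts that never reached the detached state (the kubelet.log path is hypothetical; matching is keyed to the escaped \" quoting shown in these entries):

    # Illustrative cross-check: every volume whose unmount started should
    # eventually appear in a "Volume detached" entry.
    started, detached = set(), set()
    with open("kubelet.log", encoding="utf-8") as log:
        for line in log:
            if 'operationExecutor.UnmountVolume started for volume \\"' in line:
                started.add(line.split('for volume \\"')[1].split('\\"')[0])
            elif 'Volume detached for volume \\"' in line:
                detached.add(line.split('for volume \\"')[1].split('\\"')[0])
    print("unmount started but never detached:", started - detached)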
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.049774 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-mhcj9"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.250780 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="444b0f34-436e-469e-9e68-73ccbecfec13" path="/var/lib/kubelet/pods/444b0f34-436e-469e-9e68-73ccbecfec13/volumes"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.479381 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lw44g/crc-debug-4trgh"]
Feb 01 10:09:36 crc kubenswrapper[5127]: E0201 10:09:36.480019 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="444b0f34-436e-469e-9e68-73ccbecfec13" containerName="container-00"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.480052 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="444b0f34-436e-469e-9e68-73ccbecfec13" containerName="container-00"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.480443 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="444b0f34-436e-469e-9e68-73ccbecfec13" containerName="container-00"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.481645 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.601814 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gptdn\" (UniqueName: \"kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.602322 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.704282 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.704422 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gptdn\" (UniqueName: \"kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.704506 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " pod="openshift-must-gather-lw44g/crc-debug-4trgh"
Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.745852 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gptdn\" (UniqueName: \"kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn\") pod \"crc-debug-4trgh\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " 
pod="openshift-must-gather-lw44g/crc-debug-4trgh" Feb 01 10:09:36 crc kubenswrapper[5127]: I0201 10:09:36.803408 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-4trgh" Feb 01 10:09:37 crc kubenswrapper[5127]: I0201 10:09:37.063255 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-4trgh" event={"ID":"e251839b-6cbc-41f1-b6a4-534c9e313342","Type":"ContainerStarted","Data":"7104e8aaff15ca2b013fbb266c149848e4b34c9ebed7cd1e72167c7ccba67d91"} Feb 01 10:09:38 crc kubenswrapper[5127]: I0201 10:09:38.082798 5127 generic.go:334] "Generic (PLEG): container finished" podID="e251839b-6cbc-41f1-b6a4-534c9e313342" containerID="f0d2d8bffa6852a280e04d60fbf4c81c0df968a5df5a857bac6f4d65b4f4cf99" exitCode=0 Feb 01 10:09:38 crc kubenswrapper[5127]: I0201 10:09:38.082847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/crc-debug-4trgh" event={"ID":"e251839b-6cbc-41f1-b6a4-534c9e313342","Type":"ContainerDied","Data":"f0d2d8bffa6852a280e04d60fbf4c81c0df968a5df5a857bac6f4d65b4f4cf99"} Feb 01 10:09:38 crc kubenswrapper[5127]: I0201 10:09:38.150332 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-4trgh"] Feb 01 10:09:38 crc kubenswrapper[5127]: I0201 10:09:38.162981 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lw44g/crc-debug-4trgh"] Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.205505 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-4trgh" Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.367897 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gptdn\" (UniqueName: \"kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn\") pod \"e251839b-6cbc-41f1-b6a4-534c9e313342\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.368023 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host\") pod \"e251839b-6cbc-41f1-b6a4-534c9e313342\" (UID: \"e251839b-6cbc-41f1-b6a4-534c9e313342\") " Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.368127 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host" (OuterVolumeSpecName: "host") pod "e251839b-6cbc-41f1-b6a4-534c9e313342" (UID: "e251839b-6cbc-41f1-b6a4-534c9e313342"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.369030 5127 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e251839b-6cbc-41f1-b6a4-534c9e313342-host\") on node \"crc\" DevicePath \"\"" Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.383856 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn" (OuterVolumeSpecName: "kube-api-access-gptdn") pod "e251839b-6cbc-41f1-b6a4-534c9e313342" (UID: "e251839b-6cbc-41f1-b6a4-534c9e313342"). InnerVolumeSpecName "kube-api-access-gptdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:09:39 crc kubenswrapper[5127]: I0201 10:09:39.471750 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gptdn\" (UniqueName: \"kubernetes.io/projected/e251839b-6cbc-41f1-b6a4-534c9e313342-kube-api-access-gptdn\") on node \"crc\" DevicePath \"\"" Feb 01 10:09:40 crc kubenswrapper[5127]: I0201 10:09:40.112500 5127 scope.go:117] "RemoveContainer" containerID="f0d2d8bffa6852a280e04d60fbf4c81c0df968a5df5a857bac6f4d65b4f4cf99" Feb 01 10:09:40 crc kubenswrapper[5127]: I0201 10:09:40.112567 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lw44g/crc-debug-4trgh" Feb 01 10:09:40 crc kubenswrapper[5127]: I0201 10:09:40.251996 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e251839b-6cbc-41f1-b6a4-534c9e313342" path="/var/lib/kubelet/pods/e251839b-6cbc-41f1-b6a4-534c9e313342/volumes" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.052812 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xx2dx"] Feb 01 10:10:47 crc kubenswrapper[5127]: E0201 10:10:47.054721 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251839b-6cbc-41f1-b6a4-534c9e313342" containerName="container-00" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.054747 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251839b-6cbc-41f1-b6a4-534c9e313342" containerName="container-00" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.055060 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251839b-6cbc-41f1-b6a4-534c9e313342" containerName="container-00" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.057641 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.066898 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xx2dx"] Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.129441 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sr6\" (UniqueName: \"kubernetes.io/projected/96661957-517e-40fd-a208-c0cf8c58c34c-kube-api-access-t4sr6\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.129531 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-utilities\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.129641 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-catalog-content\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.232847 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sr6\" (UniqueName: \"kubernetes.io/projected/96661957-517e-40fd-a208-c0cf8c58c34c-kube-api-access-t4sr6\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.232947 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-utilities\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.232996 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-catalog-content\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.233441 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-utilities\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.233534 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96661957-517e-40fd-a208-c0cf8c58c34c-catalog-content\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.262533 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t4sr6\" (UniqueName: \"kubernetes.io/projected/96661957-517e-40fd-a208-c0cf8c58c34c-kube-api-access-t4sr6\") pod \"certified-operators-xx2dx\" (UID: \"96661957-517e-40fd-a208-c0cf8c58c34c\") " pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:47 crc kubenswrapper[5127]: I0201 10:10:47.382033 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:48 crc kubenswrapper[5127]: I0201 10:10:48.011444 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xx2dx"] Feb 01 10:10:49 crc kubenswrapper[5127]: I0201 10:10:49.037653 5127 generic.go:334] "Generic (PLEG): container finished" podID="96661957-517e-40fd-a208-c0cf8c58c34c" containerID="4d5836d56fad5f6149bc8fd017eae79b147cd1bf1a4476d0e7b37a173b36bbf8" exitCode=0 Feb 01 10:10:49 crc kubenswrapper[5127]: I0201 10:10:49.037847 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xx2dx" event={"ID":"96661957-517e-40fd-a208-c0cf8c58c34c","Type":"ContainerDied","Data":"4d5836d56fad5f6149bc8fd017eae79b147cd1bf1a4476d0e7b37a173b36bbf8"} Feb 01 10:10:49 crc kubenswrapper[5127]: I0201 10:10:49.038371 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xx2dx" event={"ID":"96661957-517e-40fd-a208-c0cf8c58c34c","Type":"ContainerStarted","Data":"4c718e64b974a29d5de6233bc9fac2c6b3102a19bfa8e717cdf6123dede4f290"} Feb 01 10:10:55 crc kubenswrapper[5127]: I0201 10:10:55.111515 5127 generic.go:334] "Generic (PLEG): container finished" podID="96661957-517e-40fd-a208-c0cf8c58c34c" containerID="0bebb9abd993a5044ae0ccf9d7436f6ad57a4936da0686570d419f0ebde1ff1f" exitCode=0 Feb 01 10:10:55 crc kubenswrapper[5127]: I0201 10:10:55.111614 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xx2dx" event={"ID":"96661957-517e-40fd-a208-c0cf8c58c34c","Type":"ContainerDied","Data":"0bebb9abd993a5044ae0ccf9d7436f6ad57a4936da0686570d419f0ebde1ff1f"} Feb 01 10:10:56 crc kubenswrapper[5127]: I0201 10:10:56.122985 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xx2dx" event={"ID":"96661957-517e-40fd-a208-c0cf8c58c34c","Type":"ContainerStarted","Data":"0d6ab8edd71c1610b410eb95276d62787cca5ca95149ceba66f773c1709f3acd"} Feb 01 10:10:56 crc kubenswrapper[5127]: I0201 10:10:56.141534 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xx2dx" podStartSLOduration=2.435644156 podStartE2EDuration="9.141517083s" podCreationTimestamp="2026-02-01 10:10:47 +0000 UTC" firstStartedPulling="2026-02-01 10:10:49.043904794 +0000 UTC m=+12199.529807157" lastFinishedPulling="2026-02-01 10:10:55.749777721 +0000 UTC m=+12206.235680084" observedRunningTime="2026-02-01 10:10:56.1376825 +0000 UTC m=+12206.623584863" watchObservedRunningTime="2026-02-01 10:10:56.141517083 +0000 UTC m=+12206.627419446" Feb 01 10:10:57 crc kubenswrapper[5127]: I0201 10:10:57.383134 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:57 crc kubenswrapper[5127]: I0201 10:10:57.383531 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xx2dx" Feb 01 10:10:58 crc kubenswrapper[5127]: I0201 10:10:58.433974 
5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-xx2dx" podUID="96661957-517e-40fd-a208-c0cf8c58c34c" containerName="registry-server" probeResult="failure" output=<
Feb 01 10:10:58 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 10:10:58 crc kubenswrapper[5127]: >
Feb 01 10:11:06 crc kubenswrapper[5127]: I0201 10:11:06.740969 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 10:11:06 crc kubenswrapper[5127]: I0201 10:11:06.741411 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 10:11:07 crc kubenswrapper[5127]: I0201 10:11:07.443179 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xx2dx"
Feb 01 10:11:07 crc kubenswrapper[5127]: I0201 10:11:07.497352 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xx2dx"
Feb 01 10:11:07 crc kubenswrapper[5127]: I0201 10:11:07.572143 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xx2dx"]
Feb 01 10:11:07 crc kubenswrapper[5127]: I0201 10:11:07.679730 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"]
Feb 01 10:11:07 crc kubenswrapper[5127]: I0201 10:11:07.679982 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qlzcf" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="registry-server" containerID="cri-o://b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815" gracePeriod=2
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.199949 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qlzcf"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.280760 5127 generic.go:334] "Generic (PLEG): container finished" podID="596ead02-22f5-4b2a-9c63-41b24465a402" containerID="b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815" exitCode=0
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.282676 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerDied","Data":"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815"}
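Both probe failures above are plain reachability errors: the registry-server startup probe could not connect to :50051 within its 1 s window (it later recovers, status="started" then "ready"), and the machine-config-daemon liveness probe got connection refused on 127.0.0.1:8798. A rough stand-in for what the prober reports (illustrative only; the real checks are a gRPC health probe and an HTTP GET against /health, not a bare TCP connect):

    import socket

    # Rough analogue of the 1 s connect budget seen in the probe output above.
    def tcp_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(tcp_reachable("127.0.0.1", 50051))  # registry-server gRPC port
    print(tcp_reachable("127.0.0.1", 8798))   # machine-config-daemon health port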
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.282718 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qlzcf"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.282746 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qlzcf" event={"ID":"596ead02-22f5-4b2a-9c63-41b24465a402","Type":"ContainerDied","Data":"6c852d73580176959d15682d0b1baf04d2357772b6e6f6db728030eb8d84e0fd"}
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.282774 5127 scope.go:117] "RemoveContainer" containerID="b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.309250 5127 scope.go:117] "RemoveContainer" containerID="d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.310555 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content\") pod \"596ead02-22f5-4b2a-9c63-41b24465a402\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") "
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.310737 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgm6s\" (UniqueName: \"kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s\") pod \"596ead02-22f5-4b2a-9c63-41b24465a402\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") "
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.310867 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities\") pod \"596ead02-22f5-4b2a-9c63-41b24465a402\" (UID: \"596ead02-22f5-4b2a-9c63-41b24465a402\") "
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.313261 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities" (OuterVolumeSpecName: "utilities") pod "596ead02-22f5-4b2a-9c63-41b24465a402" (UID: "596ead02-22f5-4b2a-9c63-41b24465a402"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.319641 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s" (OuterVolumeSpecName: "kube-api-access-fgm6s") pod "596ead02-22f5-4b2a-9c63-41b24465a402" (UID: "596ead02-22f5-4b2a-9c63-41b24465a402"). InnerVolumeSpecName "kube-api-access-fgm6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.374869 5127 scope.go:117] "RemoveContainer" containerID="7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.391423 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "596ead02-22f5-4b2a-9c63-41b24465a402" (UID: "596ead02-22f5-4b2a-9c63-41b24465a402"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.413763 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgm6s\" (UniqueName: \"kubernetes.io/projected/596ead02-22f5-4b2a-9c63-41b24465a402-kube-api-access-fgm6s\") on node \"crc\" DevicePath \"\"" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.413798 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.413809 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/596ead02-22f5-4b2a-9c63-41b24465a402-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.436193 5127 scope.go:117] "RemoveContainer" containerID="b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815" Feb 01 10:11:08 crc kubenswrapper[5127]: E0201 10:11:08.436711 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815\": container with ID starting with b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815 not found: ID does not exist" containerID="b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.436738 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815"} err="failed to get container status \"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815\": rpc error: code = NotFound desc = could not find container \"b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815\": container with ID starting with b24f06b93e25b40421bfa9de3c701b82a5d0cf646a9753e1fa51aa95939a6815 not found: ID does not exist" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.436757 5127 scope.go:117] "RemoveContainer" containerID="d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2" Feb 01 10:11:08 crc kubenswrapper[5127]: E0201 10:11:08.437038 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2\": container with ID starting with d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2 not found: ID does not exist" containerID="d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.437069 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2"} err="failed to get container status \"d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2\": rpc error: code = NotFound desc = could not find container \"d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2\": container with ID starting with d9763472cfaf01f9f0724e197a2a397e2415949991882e7735567653a7059ba2 not found: ID does not exist" Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.437085 5127 scope.go:117] "RemoveContainer" containerID="7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec" Feb 01 10:11:08 crc 
kubenswrapper[5127]: E0201 10:11:08.437338 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec\": container with ID starting with 7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec not found: ID does not exist" containerID="7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.437377 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec"} err="failed to get container status \"7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec\": rpc error: code = NotFound desc = could not find container \"7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec\": container with ID starting with 7b48e5f3c3cc8c2b626c661b0079915b9df1818c8268082cb2dd20d59ee32cec not found: ID does not exist"
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.616504 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"]
Feb 01 10:11:08 crc kubenswrapper[5127]: I0201 10:11:08.625941 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qlzcf"]
Feb 01 10:11:10 crc kubenswrapper[5127]: I0201 10:11:10.254951 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" path="/var/lib/kubelet/pods/596ead02-22f5-4b2a-9c63-41b24465a402/volumes"
Feb 01 10:11:36 crc kubenswrapper[5127]: I0201 10:11:36.741220 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 01 10:11:36 crc kubenswrapper[5127]: I0201 10:11:36.741957 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.391106 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"]
Feb 01 10:11:46 crc kubenswrapper[5127]: E0201 10:11:46.392685 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="registry-server"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.392712 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="registry-server"
Feb 01 10:11:46 crc kubenswrapper[5127]: E0201 10:11:46.392742 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="extract-content"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.392755 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="extract-content"
Feb 01 10:11:46 crc kubenswrapper[5127]: E0201 10:11:46.392788 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="extract-utilities"
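The E-level entries above are expected cleanup noise rather than new failures: RemoveContainer races with CRI-O garbage collection, so ContainerStatus returns NotFound for IDs that are already gone, and the cpu-manager RemoveStaleState lines log at error level while purging state for the just-deleted pod. A hypothetical triage filter (kubelet.log path assumed; markers keyed to the exact messages above):

    # Keep only E-level klog lines that are not known cleanup noise.
    BENIGN = (
        "code = NotFound desc = could not find container",
        '"RemoveStaleState: removing container"',
    )

    def actionable(line: str) -> bool:
        return ": E0" in line and not any(marker in line for marker in BENIGN)

    with open("kubelet.log", encoding="utf-8") as log:
        for line in filter(actionable, log):
            print(line.rstrip())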
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.392803 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="extract-utilities"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.393185 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ead02-22f5-4b2a-9c63-41b24465a402" containerName="registry-server"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.396069 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.418184 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"]
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.443501 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw2z4\" (UniqueName: \"kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.444064 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.444382 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.545666 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.546152 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw2z4\" (UniqueName: \"kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.546467 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.546641 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:46 
crc kubenswrapper[5127]: I0201 10:11:46.547285 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46" Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.583133 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw2z4\" (UniqueName: \"kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4\") pod \"redhat-operators-p8b46\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") " pod="openshift-marketplace/redhat-operators-p8b46" Feb 01 10:11:46 crc kubenswrapper[5127]: I0201 10:11:46.736640 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8b46" Feb 01 10:11:47 crc kubenswrapper[5127]: I0201 10:11:47.097903 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"] Feb 01 10:11:47 crc kubenswrapper[5127]: I0201 10:11:47.734273 5127 generic.go:334] "Generic (PLEG): container finished" podID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerID="c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58" exitCode=0 Feb 01 10:11:47 crc kubenswrapper[5127]: I0201 10:11:47.734319 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerDied","Data":"c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58"} Feb 01 10:11:47 crc kubenswrapper[5127]: I0201 10:11:47.734551 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerStarted","Data":"c58560cc5347e2a59042b270a9908c69792c147e3dc0f586623bb7c57eecdcb2"} Feb 01 10:11:48 crc kubenswrapper[5127]: I0201 10:11:48.753245 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerStarted","Data":"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e"} Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.576287 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"] Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.593553 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.644529 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"]
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.650944 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g79h\" (UniqueName: \"kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.651119 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.651170 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.753939 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g79h\" (UniqueName: \"kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.754108 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.754159 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.755011 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.755034 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.779320 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g79h\" (UniqueName: \"kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h\") pod \"redhat-marketplace-86cf4\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") " pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:50 crc kubenswrapper[5127]: I0201 10:11:50.917698 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:11:51 crc kubenswrapper[5127]: I0201 10:11:51.440638 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"]
Feb 01 10:11:51 crc kubenswrapper[5127]: I0201 10:11:51.796096 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerStarted","Data":"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59"}
Feb 01 10:11:51 crc kubenswrapper[5127]: I0201 10:11:51.796485 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerStarted","Data":"afdb20c39b83d8c47d96b7558711114523c12b0ecba31caa47ac719e13b85775"}
Feb 01 10:11:52 crc kubenswrapper[5127]: I0201 10:11:52.809850 5127 generic.go:334] "Generic (PLEG): container finished" podID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerID="5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59" exitCode=0
Feb 01 10:11:52 crc kubenswrapper[5127]: I0201 10:11:52.809900 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerDied","Data":"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59"}
Feb 01 10:11:52 crc kubenswrapper[5127]: I0201 10:11:52.817459 5127 generic.go:334] "Generic (PLEG): container finished" podID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerID="57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e" exitCode=0
Feb 01 10:11:52 crc kubenswrapper[5127]: I0201 10:11:52.817502 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerDied","Data":"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e"}
Feb 01 10:11:53 crc kubenswrapper[5127]: I0201 10:11:53.833501 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerStarted","Data":"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e"}
Feb 01 10:11:53 crc kubenswrapper[5127]: I0201 10:11:53.839954 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerStarted","Data":"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252"}
Feb 01 10:11:53 crc kubenswrapper[5127]: I0201 10:11:53.889781 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p8b46" podStartSLOduration=2.354523341 podStartE2EDuration="7.889758593s" podCreationTimestamp="2026-02-01 10:11:46 +0000 UTC" firstStartedPulling="2026-02-01 10:11:47.73690281 +0000 UTC m=+12258.222805193" lastFinishedPulling="2026-02-01 10:11:53.272138082 +0000 UTC m=+12263.758040445" observedRunningTime="2026-02-01 10:11:53.885033936 +0000 UTC m=+12264.370936339" watchObservedRunningTime="2026-02-01 10:11:53.889758593 +0000 UTC m=+12264.375660996"
Feb 01 10:11:54 crc kubenswrapper[5127]: E0201 10:11:54.693877 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98099c2f_4ef8_475d_8916_aeaf5df1f25e.slice/crio-3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e.scope\": RecentStats: unable to find data in memory cache]"
Feb 01 10:11:55 crc kubenswrapper[5127]: I0201 10:11:55.875449 5127 generic.go:334] "Generic (PLEG): container finished" podID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerID="3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e" exitCode=0
Feb 01 10:11:55 crc kubenswrapper[5127]: I0201 10:11:55.875500 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerDied","Data":"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e"}
Feb 01 10:11:56 crc kubenswrapper[5127]: I0201 10:11:56.737121 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:56 crc kubenswrapper[5127]: I0201 10:11:56.737482 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:11:56 crc kubenswrapper[5127]: I0201 10:11:56.890068 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerStarted","Data":"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af"}
Feb 01 10:11:56 crc kubenswrapper[5127]: I0201 10:11:56.923685 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-86cf4" podStartSLOduration=3.424310307 podStartE2EDuration="6.92365753s" podCreationTimestamp="2026-02-01 10:11:50 +0000 UTC" firstStartedPulling="2026-02-01 10:11:52.813407997 +0000 UTC m=+12263.299310390" lastFinishedPulling="2026-02-01 10:11:56.31275521 +0000 UTC m=+12266.798657613" observedRunningTime="2026-02-01 10:11:56.907895597 +0000 UTC m=+12267.393797990" watchObservedRunningTime="2026-02-01 10:11:56.92365753 +0000 UTC m=+12267.409559933"
Feb 01 10:11:57 crc kubenswrapper[5127]: I0201 10:11:57.788983 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p8b46" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" probeResult="failure" output=<
Feb 01 10:11:57 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 10:11:57 crc kubenswrapper[5127]: >
Feb 01 10:12:00 crc kubenswrapper[5127]: I0201 10:12:00.918829 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:12:00 crc kubenswrapper[5127]: I0201 10:12:00.919214 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:12:01 crc kubenswrapper[5127]: I0201 10:12:01.015122 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:12:01 crc kubenswrapper[5127]: I0201 10:12:01.078804 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:12:01 crc kubenswrapper[5127]: I0201 10:12:01.266318 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"]
Feb 01 10:12:02 crc kubenswrapper[5127]: I0201 10:12:02.978167 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-86cf4" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="registry-server" containerID="cri-o://02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af" gracePeriod=2
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.623064 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86cf4"
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.776538 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g79h\" (UniqueName: \"kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h\") pod \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") "
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.776729 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities\") pod \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") "
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.776784 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content\") pod \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\" (UID: \"98099c2f-4ef8-475d-8916-aeaf5df1f25e\") "
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.778030 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities" (OuterVolumeSpecName: "utilities") pod "98099c2f-4ef8-475d-8916-aeaf5df1f25e" (UID: "98099c2f-4ef8-475d-8916-aeaf5df1f25e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.785351 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h" (OuterVolumeSpecName: "kube-api-access-9g79h") pod "98099c2f-4ef8-475d-8916-aeaf5df1f25e" (UID: "98099c2f-4ef8-475d-8916-aeaf5df1f25e"). InnerVolumeSpecName "kube-api-access-9g79h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.800058 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98099c2f-4ef8-475d-8916-aeaf5df1f25e" (UID: "98099c2f-4ef8-475d-8916-aeaf5df1f25e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.880756 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g79h\" (UniqueName: \"kubernetes.io/projected/98099c2f-4ef8-475d-8916-aeaf5df1f25e-kube-api-access-9g79h\") on node \"crc\" DevicePath \"\"" Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.881138 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.881157 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98099c2f-4ef8-475d-8916-aeaf5df1f25e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.989482 5127 generic.go:334] "Generic (PLEG): container finished" podID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerID="02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af" exitCode=0 Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.989545 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerDied","Data":"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af"} Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.989613 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-86cf4" Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.989668 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-86cf4" event={"ID":"98099c2f-4ef8-475d-8916-aeaf5df1f25e","Type":"ContainerDied","Data":"afdb20c39b83d8c47d96b7558711114523c12b0ecba31caa47ac719e13b85775"} Feb 01 10:12:03 crc kubenswrapper[5127]: I0201 10:12:03.989703 5127 scope.go:117] "RemoveContainer" containerID="02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.016739 5127 scope.go:117] "RemoveContainer" containerID="3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.034702 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"] Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.072573 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-86cf4"] Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.086697 5127 scope.go:117] "RemoveContainer" containerID="5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.113132 5127 scope.go:117] "RemoveContainer" containerID="02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af" Feb 01 10:12:04 crc kubenswrapper[5127]: E0201 10:12:04.113532 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af\": container with ID starting with 02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af not found: ID does not exist" containerID="02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.113570 5127 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af"} err="failed to get container status \"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af\": rpc error: code = NotFound desc = could not find container \"02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af\": container with ID starting with 02aa8c9a024556c472ad26fe629f605a3c68d57a1cbd7661e13e32713704b9af not found: ID does not exist" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.113630 5127 scope.go:117] "RemoveContainer" containerID="3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e" Feb 01 10:12:04 crc kubenswrapper[5127]: E0201 10:12:04.113877 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e\": container with ID starting with 3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e not found: ID does not exist" containerID="3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.113967 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e"} err="failed to get container status \"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e\": rpc error: code = NotFound desc = could not find container \"3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e\": container with ID starting with 3e740969537727393c7768d2a85ee6977813f2faa63fbcf67f503faf10f9fd1e not found: ID does not exist" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.114018 5127 scope.go:117] "RemoveContainer" containerID="5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59" Feb 01 10:12:04 crc kubenswrapper[5127]: E0201 10:12:04.114263 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59\": container with ID starting with 5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59 not found: ID does not exist" containerID="5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.114292 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59"} err="failed to get container status \"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59\": rpc error: code = NotFound desc = could not find container \"5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59\": container with ID starting with 5ae25e55e7c03e30605b4b92ef606336543edc4acee3d5958f947bb0c84c3b59 not found: ID does not exist" Feb 01 10:12:04 crc kubenswrapper[5127]: I0201 10:12:04.251013 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" path="/var/lib/kubelet/pods/98099c2f-4ef8-475d-8916-aeaf5df1f25e/volumes" Feb 01 10:12:06 crc kubenswrapper[5127]: I0201 10:12:06.741118 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Feb 01 10:12:06 crc kubenswrapper[5127]: I0201 10:12:06.741789 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 01 10:12:06 crc kubenswrapper[5127]: I0201 10:12:06.741863 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk"
Feb 01 10:12:06 crc kubenswrapper[5127]: I0201 10:12:06.743149 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 01 10:12:06 crc kubenswrapper[5127]: I0201 10:12:06.743257 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176" gracePeriod=600
Feb 01 10:12:07 crc kubenswrapper[5127]: I0201 10:12:07.075541 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176" exitCode=0
Feb 01 10:12:07 crc kubenswrapper[5127]: I0201 10:12:07.076340 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176"}
Feb 01 10:12:07 crc kubenswrapper[5127]: I0201 10:12:07.076603 5127 scope.go:117] "RemoveContainer" containerID="239bb0f2b52e2c0ffc7e815c3fc16691207b8b7dbdc7e6069e7574818ea916be"
Feb 01 10:12:07 crc kubenswrapper[5127]: I0201 10:12:07.822929 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p8b46" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" probeResult="failure" output=<
Feb 01 10:12:07 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 10:12:07 crc kubenswrapper[5127]: >
Feb 01 10:12:08 crc kubenswrapper[5127]: I0201 10:12:08.093335 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df"}
Feb 01 10:12:16 crc kubenswrapper[5127]: I0201 10:12:16.826862 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:12:16 crc kubenswrapper[5127]: I0201 10:12:16.902074 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:12:17 crc kubenswrapper[5127]: I0201 10:12:17.624361 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"]
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.227727 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p8b46" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" containerID="cri-o://898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252" gracePeriod=2
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.795179 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8b46"
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.877882 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content\") pod \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") "
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.878019 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw2z4\" (UniqueName: \"kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4\") pod \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") "
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.878246 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities\") pod \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\" (UID: \"4111ae87-b2ca-4dd2-95aa-309b962f98b6\") "
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.879262 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities" (OuterVolumeSpecName: "utilities") pod "4111ae87-b2ca-4dd2-95aa-309b962f98b6" (UID: "4111ae87-b2ca-4dd2-95aa-309b962f98b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.882138 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-utilities\") on node \"crc\" DevicePath \"\""
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.901852 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4" (OuterVolumeSpecName: "kube-api-access-bw2z4") pod "4111ae87-b2ca-4dd2-95aa-309b962f98b6" (UID: "4111ae87-b2ca-4dd2-95aa-309b962f98b6"). InnerVolumeSpecName "kube-api-access-bw2z4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 10:12:18 crc kubenswrapper[5127]: I0201 10:12:18.983699 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw2z4\" (UniqueName: \"kubernetes.io/projected/4111ae87-b2ca-4dd2-95aa-309b962f98b6-kube-api-access-bw2z4\") on node \"crc\" DevicePath \"\""
Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.003598 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4111ae87-b2ca-4dd2-95aa-309b962f98b6" (UID: "4111ae87-b2ca-4dd2-95aa-309b962f98b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.084896 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4111ae87-b2ca-4dd2-95aa-309b962f98b6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.245102 5127 generic.go:334] "Generic (PLEG): container finished" podID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerID="898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252" exitCode=0 Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.245161 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerDied","Data":"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252"} Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.245197 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p8b46" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.245233 5127 scope.go:117] "RemoveContainer" containerID="898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.245213 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p8b46" event={"ID":"4111ae87-b2ca-4dd2-95aa-309b962f98b6","Type":"ContainerDied","Data":"c58560cc5347e2a59042b270a9908c69792c147e3dc0f586623bb7c57eecdcb2"} Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.279468 5127 scope.go:117] "RemoveContainer" containerID="57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.313134 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"] Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.327427 5127 scope.go:117] "RemoveContainer" containerID="c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.330240 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p8b46"] Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.365005 5127 scope.go:117] "RemoveContainer" containerID="898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252" Feb 01 10:12:19 crc kubenswrapper[5127]: E0201 10:12:19.365445 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252\": container with ID starting with 898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252 not found: ID does not exist" containerID="898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.365495 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252"} err="failed to get container status \"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252\": rpc error: code = NotFound desc = could not find container \"898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252\": container with ID starting with 898428447eed6772e9540c1e0ee42143fac32617d83e0c7650e88551ca2c8252 not found: ID does not exist" Feb 01 10:12:19 crc 
kubenswrapper[5127]: I0201 10:12:19.365522 5127 scope.go:117] "RemoveContainer" containerID="57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e" Feb 01 10:12:19 crc kubenswrapper[5127]: E0201 10:12:19.366087 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e\": container with ID starting with 57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e not found: ID does not exist" containerID="57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.366119 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e"} err="failed to get container status \"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e\": rpc error: code = NotFound desc = could not find container \"57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e\": container with ID starting with 57c469cfe1363ebb7baf97bca598a38714d388e7c999c0291c4277e1252dcf6e not found: ID does not exist" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.366151 5127 scope.go:117] "RemoveContainer" containerID="c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58" Feb 01 10:12:19 crc kubenswrapper[5127]: E0201 10:12:19.366426 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58\": container with ID starting with c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58 not found: ID does not exist" containerID="c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58" Feb 01 10:12:19 crc kubenswrapper[5127]: I0201 10:12:19.366455 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58"} err="failed to get container status \"c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58\": rpc error: code = NotFound desc = could not find container \"c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58\": container with ID starting with c7978b367af2c26dc0d64b9177a187fe6744a2f33662e79e92765f47538d0b58 not found: ID does not exist" Feb 01 10:12:20 crc kubenswrapper[5127]: I0201 10:12:20.260322 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" path="/var/lib/kubelet/pods/4111ae87-b2ca-4dd2-95aa-309b962f98b6/volumes" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.543143 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4/init-config-reloader/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.705610 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4/init-config-reloader/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.714011 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4/config-reloader/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.733802 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_d6d7caf7-f5c8-4c64-883d-d3e8c27c31e4/alertmanager/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.900292 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0ef0879a-fd11-40c8-aabe-6e26c48c5b5f/aodh-api/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.916820 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0ef0879a-fd11-40c8-aabe-6e26c48c5b5f/aodh-evaluator/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.945591 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0ef0879a-fd11-40c8-aabe-6e26c48c5b5f/aodh-listener/0.log" Feb 01 10:13:04 crc kubenswrapper[5127]: I0201 10:13:04.990836 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0ef0879a-fd11-40c8-aabe-6e26c48c5b5f/aodh-notifier/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.132179 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58b7b9b55b-6s7jl_61db7abe-1b12-4dae-884e-86399d0c5f63/barbican-api-log/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.141521 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-58b7b9b55b-6s7jl_61db7abe-1b12-4dae-884e-86399d0c5f63/barbican-api/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.305613 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-df958c9bb-zn2lr_77fe9270-c3c0-4eb4-b8fd-d88d1ab06756/barbican-keystone-listener/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.467771 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fc8f58b97-kw7wk_c5b2030b-064a-4a95-aa42-7acffae51598/barbican-worker/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.562169 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6fc8f58b97-kw7wk_c5b2030b-064a-4a95-aa42-7acffae51598/barbican-worker-log/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.824107 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-pmwr5_0cd979e7-d370-42d4-a165-9c0792d98a4d/bootstrap-openstack-openstack-cell1/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.968018 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-df958c9bb-zn2lr_77fe9270-c3c0-4eb4-b8fd-d88d1ab06756/barbican-keystone-listener-log/0.log" Feb 01 10:13:05 crc kubenswrapper[5127]: I0201 10:13:05.994802 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-wkc7q_5f10050c-2269-4866-8d6d-70ebc730eca3/bootstrap-openstack-openstack-networker/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.089994 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0b56c932-5925-4fd1-b889-86e2d62a41ec/ceilometer-central-agent/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.175445 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0b56c932-5925-4fd1-b889-86e2d62a41ec/ceilometer-notification-agent/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.225833 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0b56c932-5925-4fd1-b889-86e2d62a41ec/proxy-httpd/0.log" Feb 01 10:13:06 crc 
kubenswrapper[5127]: I0201 10:13:06.231603 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0b56c932-5925-4fd1-b889-86e2d62a41ec/sg-core/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.368976 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-8jrw4_fe0d7dc2-efc9-47da-b525-c38469d4a8ce/ceph-client-openstack-openstack-cell1/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.602593 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d900cca-9100-4348-babc-9c714853bb60/cinder-api/0.log" Feb 01 10:13:06 crc kubenswrapper[5127]: I0201 10:13:06.669969 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9d900cca-9100-4348-babc-9c714853bb60/cinder-api-log/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.053167 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a3efe1f7-64a9-45d0-a949-536543461c61/cinder-backup/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.084148 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_a3efe1f7-64a9-45d0-a949-536543461c61/probe/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.108099 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5b091241-0d4c-4126-b7f9-39cd4a145fd9/cinder-scheduler/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.370212 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_5b091241-0d4c-4126-b7f9-39cd4a145fd9/probe/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.377894 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2a245e44-99f1-49ba-b15e-bb4ffd755769/probe/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.677412 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-bm985_e605bdd8-d806-44ed-a832-fc7917f53089/configure-network-openstack-openstack-cell1/0.log" Feb 01 10:13:07 crc kubenswrapper[5127]: I0201 10:13:07.923553 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-9mfdv_fbcbec86-272a-408e-9f57-5478b28fe0ed/configure-network-openstack-openstack-networker/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.246916 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-hfch5_cf4ccdda-e10a-4a6d-963f-e0c03c30cc1e/configure-os-openstack-openstack-cell1/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.280209 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-8bccd_df12e809-8735-4243-9027-a9aefe524c55/configure-os-openstack-openstack-networker/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.536120 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f4c47df59-sgbvx_357483ec-5a75-4748-b811-c05f14bf9753/init/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.709070 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f4c47df59-sgbvx_357483ec-5a75-4748-b811-c05f14bf9753/init/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.773976 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-vxnlw_283b25af-dc74-4623-9e82-90c0a32e6fc5/download-cache-openstack-openstack-cell1/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.908505 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5f4c47df59-sgbvx_357483ec-5a75-4748-b811-c05f14bf9753/dnsmasq-dns/0.log" Feb 01 10:13:08 crc kubenswrapper[5127]: I0201 10:13:08.964265 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_2a245e44-99f1-49ba-b15e-bb4ffd755769/cinder-volume/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.028776 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-mm8b7_896963d4-1a35-41cc-81b4-d5695874f82a/download-cache-openstack-openstack-networker/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.221045 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_24109b1d-3e62-4cda-8dfb-8591d4042e6f/glance-httpd/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.237969 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_24109b1d-3e62-4cda-8dfb-8591d4042e6f/glance-log/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.264917 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5704ab50-5657-44ce-bf06-34a2961cbfb3/glance-httpd/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.319792 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5704ab50-5657-44ce-bf06-34a2961cbfb3/glance-log/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.557875 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-79964bdd45-v7lmb_7b587dac-ef38-4834-ae3e-16b2cde5219a/heat-api/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.622666 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-666985764f-pkz2m_0a8e11d4-1e74-4618-a7a2-88e646e3d80d/heat-engine/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.636488 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6bf65bd6f7-jsgn5_09e2ac26-a2d8-42f4-b58b-33adb0156755/heat-cfnapi/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.839555 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-94fcc5cfc-xwtgm_dabbf971-66fb-461f-8c55-4531caf0d644/horizon/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.920794 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-rccdw_f36acb8e-4dbb-4655-9911-5c30e71c1287/install-certs-openstack-openstack-cell1/0.log" Feb 01 10:13:09 crc kubenswrapper[5127]: I0201 10:13:09.994082 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-94fcc5cfc-xwtgm_dabbf971-66fb-461f-8c55-4531caf0d644/horizon-log/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.071244 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-52rqc_9eb8c388-3713-407f-8d2c-c8e5bdc446fb/install-certs-openstack-openstack-networker/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.230512 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-d6vxn_11fb127c-46a4-421b-aefe-78b11051f499/install-os-openstack-openstack-cell1/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.275077 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-dbs22_d9ccb5f5-614a-4b43-972c-20b2b907a88c/install-os-openstack-openstack-networker/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.426738 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29499001-jzqw2_2ff2f742-9eb6-450f-af81-ca38b3654694/keystone-cron/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.426785 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29498941-gpxpc_d36d9727-403a-4dfe-a3d4-7f5c6c9e08a0/keystone-cron/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.642847 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8b0e6a48-aa67-4e6b-9e55-bd312fa6ee25/kube-state-metrics/0.log" Feb 01 10:13:10 crc kubenswrapper[5127]: I0201 10:13:10.986437 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-9ng7k_0a0a2c40-4f67-4b10-b267-5981c37d8253/libvirt-openstack-openstack-cell1/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.440411 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-f4f694774-qc4rf_26fde84e-4bfc-4181-8287-e3a1d0ccb81a/keystone-api/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.478270 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7d8066ea-58ff-4b2a-84ac-164dcf7197ff/probe/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.503400 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_7d8066ea-58ff-4b2a-84ac-164dcf7197ff/manila-scheduler/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.633134 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_60a3e644-23c5-4c50-8a26-24c70e5701f1/manila-api/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.727333 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_60a3e644-23c5-4c50-8a26-24c70e5701f1/manila-api-log/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.776843 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_63c4b3fc-9fd2-4bee-af21-71a7616cf171/probe/0.log" Feb 01 10:13:11 crc kubenswrapper[5127]: I0201 10:13:11.792196 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_63c4b3fc-9fd2-4bee-af21-71a7616cf171/manila-share/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.207795 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-rkfr2_8a1da379-3eb9-4fa1-abe1-aabf0654833c/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.264881 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7467ffb6b7-d4mz8_eb4805fb-4027-4dd8-980d-cc5004dac4f3/neutron-httpd/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.512505 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-gdkvb_9032a10b-4966-4895-a50f-e0d4682049e9/neutron-metadata-openstack-openstack-cell1/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.630095 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-s9g4w_56a60225-4a4f-47ae-bb41-a4510a34a915/neutron-metadata-openstack-openstack-networker/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.661672 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7467ffb6b7-d4mz8_eb4805fb-4027-4dd8-980d-cc5004dac4f3/neutron-api/0.log" Feb 01 10:13:12 crc kubenswrapper[5127]: I0201 10:13:12.755622 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-74l6h_25c41c63-61a4-4e80-b834-5f90523eb171/neutron-sriov-openstack-openstack-cell1/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.070864 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dbf60a1d-4462-418a-8ee6-23da577357fe/nova-api-api/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.242935 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ab1f6e96-255f-4472-b0ac-1a712d4b40a2/nova-cell0-conductor-conductor/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.433748 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f0f1c414-d0df-4128-9b09-b5a2028f3454/nova-cell1-conductor-conductor/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.454797 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_dbf60a1d-4462-418a-8ee6-23da577357fe/nova-api-log/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.599093 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_38794c77-9e2e-450a-832c-5e913b09350a/nova-cell1-novncproxy-novncproxy/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.746689 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellncfvz_7bea9abe-b0b7-41af-bac0-07d4f854c1d6/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.899038 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-6ct2s_30b9dbac-2336-4eab-8221-09ef2c34d3a7/nova-cell1-openstack-openstack-cell1/0.log" Feb 01 10:13:13 crc kubenswrapper[5127]: I0201 10:13:13.947999 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df602c4b-e8eb-4c9e-b855-6196b51eebe5/nova-metadata-log/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.099167 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_df602c4b-e8eb-4c9e-b855-6196b51eebe5/nova-metadata-metadata/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.240449 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6063c9cb-f98d-44a8-863d-0ac61cd4257c/nova-scheduler-scheduler/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.320111 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d42fff15-1ed7-468f-bf75-609929079667/mysql-bootstrap/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 
10:13:14.471666 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d42fff15-1ed7-468f-bf75-609929079667/galera/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.521308 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d42fff15-1ed7-468f-bf75-609929079667/mysql-bootstrap/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.557050 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_82cefe36-3a26-4a7e-add3-b445cb590fe5/mysql-bootstrap/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.813305 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_82cefe36-3a26-4a7e-add3-b445cb590fe5/galera/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.833287 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_82cefe36-3a26-4a7e-add3-b445cb590fe5/mysql-bootstrap/0.log" Feb 01 10:13:14 crc kubenswrapper[5127]: I0201 10:13:14.857570 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_6dc16ed6-1e14-435e-bafa-e151fce8a2bd/openstackclient/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.052057 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eda4fd4c-e6bd-44ab-8790-d65b2e2054a6/ovn-northd/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.096333 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_eda4fd4c-e6bd-44ab-8790-d65b2e2054a6/openstack-network-exporter/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.528105 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-b4fz9_36c41840-5738-4cb4-973b-c8d38371cfcb/ovn-openstack-openstack-cell1/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.573534 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-dkmwg_c0c7f749-f26b-40b4-bbc1-38446be4a68d/ovn-openstack-openstack-networker/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.722422 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d/ovsdbserver-nb/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.811689 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ac5ceef4-919c-4a17-af7e-6ae27bfc5a3d/openstack-network-exporter/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.894907 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_89ca6150-db6e-4770-88c3-d495682edb2e/openstack-network-exporter/0.log" Feb 01 10:13:15 crc kubenswrapper[5127]: I0201 10:13:15.956911 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_89ca6150-db6e-4770-88c3-d495682edb2e/ovsdbserver-nb/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.109631 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8e5bfb6-7976-4e24-89d4-840cba014b37/openstack-network-exporter/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.130732 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_f8e5bfb6-7976-4e24-89d4-840cba014b37/ovsdbserver-nb/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.275784 5127 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4775eb36-bd2b-417d-8156-2628ece4a87a/openstack-network-exporter/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.358417 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4775eb36-bd2b-417d-8156-2628ece4a87a/ovsdbserver-sb/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.488458 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fed45656-ff80-4f32-aa27-26223fa85bf5/openstack-network-exporter/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.671067 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_6846c586-1fd9-443a-9317-33037c64e831/openstack-network-exporter/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.675232 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_fed45656-ff80-4f32-aa27-26223fa85bf5/ovsdbserver-sb/0.log" Feb 01 10:13:16 crc kubenswrapper[5127]: I0201 10:13:16.843505 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_6846c586-1fd9-443a-9317-33037c64e831/ovsdbserver-sb/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.110462 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575cc7d444-lrng8_4f22699e-8bf9-4bd5-8b56-6f2cab072f17/placement-api/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.234688 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6fzgd_d0e75902-bec9-4e02-851f-b2a306355649/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.249630 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-575cc7d444-lrng8_4f22699e-8bf9-4bd5-8b56-6f2cab072f17/placement-log/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.305710 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-ndspfv_4c4da666-5d8e-4fee-b952-2b45b93011ad/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.463830 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ee773d63-9213-4566-a575-e86b796fa167/init-config-reloader/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.645089 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ee773d63-9213-4566-a575-e86b796fa167/config-reloader/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.649489 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ee773d63-9213-4566-a575-e86b796fa167/thanos-sidecar/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.708885 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ee773d63-9213-4566-a575-e86b796fa167/init-config-reloader/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.732892 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_ee773d63-9213-4566-a575-e86b796fa167/prometheus/0.log" Feb 01 10:13:17 crc kubenswrapper[5127]: I0201 10:13:17.915986 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4052af64-63c0-4e94-bcb9-96463c2e98ce/setup-container/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.098794 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4052af64-63c0-4e94-bcb9-96463c2e98ce/setup-container/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.140891 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4052af64-63c0-4e94-bcb9-96463c2e98ce/rabbitmq/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.195240 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6bb04d8b-7336-442e-ab61-a9a207787027/setup-container/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.429759 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-bdbff_94b4f8d3-30e6-4ded-8c7b-6efec8dc4f69/reboot-os-openstack-openstack-cell1/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.465523 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6bb04d8b-7336-442e-ab61-a9a207787027/setup-container/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.765325 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6bb04d8b-7336-442e-ab61-a9a207787027/rabbitmq/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.780964 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-58kk7_48821e45-5c97-4162-8c43-259f2d4b6a7c/reboot-os-openstack-openstack-networker/0.log" Feb 01 10:13:18 crc kubenswrapper[5127]: I0201 10:13:18.893099 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-95565_0b38547b-b0d5-4063-8e6b-3dfe5977677f/run-os-openstack-openstack-cell1/0.log" Feb 01 10:13:19 crc kubenswrapper[5127]: I0201 10:13:19.068624 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-networker-tfsrs_2f220fdd-7457-4ef7-8314-727210a50eda/run-os-openstack-openstack-networker/0.log" Feb 01 10:13:19 crc kubenswrapper[5127]: I0201 10:13:19.142617 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-fjs75_7cf892b5-5638-474a-b426-aa1d7b4952af/ssh-known-hosts-openstack/0.log" Feb 01 10:13:19 crc kubenswrapper[5127]: I0201 10:13:19.494722 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-qdr6s_31d637d0-729b-42fe-8cbc-1ed2c449c2b3/telemetry-openstack-openstack-cell1/0.log" Feb 01 10:13:19 crc kubenswrapper[5127]: I0201 10:13:19.513813 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_068a067f-bb12-4a63-a3ed-7eb05da0ca52/tempest-tests-tempest-tests-runner/0.log" Feb 01 10:13:19 crc kubenswrapper[5127]: I0201 10:13:19.535337 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_ead6462a-b14a-4cac-820a-67b142ddfc86/test-operator-logs-container/0.log" Feb 01 10:13:20 crc kubenswrapper[5127]: I0201 10:13:20.005173 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-hmlfc_833a198f-222c-4ce9-a629-f1138fbd1fce/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 01 10:13:20 crc kubenswrapper[5127]: 
I0201 10:13:20.156104 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-sdxvg_2017b1bc-65d4-4031-a56c-8aa5399a6446/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Feb 01 10:13:20 crc kubenswrapper[5127]: I0201 10:13:20.259605 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-vhpqj_7b6d0276-99a8-40e9-8c7a-53633dc1c58f/validate-network-openstack-openstack-cell1/0.log" Feb 01 10:13:20 crc kubenswrapper[5127]: I0201 10:13:20.407378 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-8n9vd_e8c396ea-90b7-4ace-962d-2dfa97f0488a/validate-network-openstack-openstack-networker/0.log" Feb 01 10:13:31 crc kubenswrapper[5127]: I0201 10:13:31.038211 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_2bc6b5e9-8c7b-4144-b943-e57514d1f11f/memcached/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.029613 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/util/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.303066 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/pull/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.309108 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/pull/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.379879 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/util/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.562549 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/util/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.580729 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/pull/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.650797 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b6f0ceecaefb29a8e5801b760101e31cf6295f8d10236ea1e93fc043d1glbtz_608c77ac-5d08-4c81-803d-b9aed3ba9d73/extract/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.919970 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-msn79_41b36a3a-ccdb-4db2-b23b-110fdd81e06b/manager/0.log" Feb 01 10:13:48 crc kubenswrapper[5127]: I0201 10:13:48.925358 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-5qgzf_e4b9555b-b0a0-48c0-a488-2fa76ba13e19/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.042403 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-c5r2v_155e3129-7d47-4eef-ae17-445a4847e3c4/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.245008 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-sjjbc_dfba9a52-4b08-4001-bec8-0faf57fb61a0/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.285559 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-lj688_d5b966a4-4cc2-4ed9-a4fb-3f2c0124306d/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.296319 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-k59z5_020efd87-3f4e-4762-9853-4f08d7f744cd/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.485804 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-kssvc_b14e4493-339f-480c-84eb-7be38d967aef/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.863528 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-v8fwx_6c57a02e-b635-4a20-921e-fc1ce29bd6e1/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.933961 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-gv8nn_a050d4cf-e8ae-4983-aeef-5504bd4ffdc3/manager/0.log" Feb 01 10:13:49 crc kubenswrapper[5127]: I0201 10:13:49.978280 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-hv9b6_181c451d-b9c2-4b75-b271-a3d33fc7c200/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.130142 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-ll87b_60763bd0-4a99-48da-b53f-1dfddcfd2dda/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.200188 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-2lrrd_f972515b-c0d8-497e-87e8-ec5a8f3e4151/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.427782 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-kwkzq_cb22bae8-ed04-41f1-8061-149713da4d9f/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.506610 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-wgpr9_be2cd21c-5775-450d-9933-9914e99730a6/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.631096 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86dfb79cc7c6xhf_a6886643-fe68-466d-ab2f-0dfd752dbe0f/manager/0.log" Feb 01 10:13:50 crc kubenswrapper[5127]: I0201 10:13:50.740203 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-757f46c65d-d2mn9_3d5d76ba-0b08-4c6e-b89d-fa438a766a13/operator/0.log" Feb 01 10:13:51 crc kubenswrapper[5127]: I0201 10:13:51.279767 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-6z59p_22ada428-9d6c-41dc-8c3f-a6684d72f4b3/manager/0.log" Feb 01 10:13:51 crc kubenswrapper[5127]: I0201 10:13:51.290950 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-m9nqs_7b4c130e-a6b2-4e51-a25f-044db714852e/registry-server/0.log" Feb 01 10:13:51 crc kubenswrapper[5127]: I0201 10:13:51.540667 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-5wvqj_71b843c8-50d1-4b1b-83ca-33d72bb16b5e/manager/0.log" Feb 01 10:13:51 crc kubenswrapper[5127]: I0201 10:13:51.594403 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-bbcxg_144ee88b-a5c7-46da-9e39-8f3c71d9499d/operator/0.log" Feb 01 10:13:52 crc kubenswrapper[5127]: I0201 10:13:52.026192 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-646f6_464ffa34-bb5b-4e78-9fa1-d106fd67de1d/manager/0.log" Feb 01 10:13:52 crc kubenswrapper[5127]: I0201 10:13:52.271631 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-9r2rh_43e73360-cfda-420c-8df1-fe2e50b31d0c/manager/0.log" Feb 01 10:13:52 crc kubenswrapper[5127]: I0201 10:13:52.293120 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-c89bv_c25dac7e-0533-4ac5-9fc8-cabbf5e340bc/manager/0.log" Feb 01 10:13:52 crc kubenswrapper[5127]: I0201 10:13:52.404550 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-97rgn_ec968356-989e-4e17-b755-66c8a2b8109a/manager/0.log" Feb 01 10:13:53 crc kubenswrapper[5127]: I0201 10:13:53.244188 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b6f655c79-8hfvm_7205026a-cd8b-4d94-9581-9fb0b21c5c4c/manager/0.log" Feb 01 10:14:14 crc kubenswrapper[5127]: I0201 10:14:14.545056 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pdls2_12e95361-86af-4218-b7f7-56582c3a17b7/control-plane-machine-set-operator/0.log" Feb 01 10:14:14 crc kubenswrapper[5127]: I0201 10:14:14.935822 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kvg79_2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c/kube-rbac-proxy/0.log" Feb 01 10:14:14 crc kubenswrapper[5127]: I0201 10:14:14.952852 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kvg79_2c9cfc0f-f3e7-4a7a-860a-5961cfa3ab6c/machine-api-operator/0.log" Feb 01 10:14:30 crc kubenswrapper[5127]: I0201 10:14:30.327143 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-grq4k_bab839cb-ff86-4576-8ff7-7de82e0f6757/cert-manager-controller/0.log" Feb 01 10:14:30 crc kubenswrapper[5127]: I0201 10:14:30.526384 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-dhn4h_2eca325f-b927-4e7b-8500-875f68594b7e/cert-manager-cainjector/0.log" Feb 01 10:14:30 crc kubenswrapper[5127]: I0201 10:14:30.534146 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-7nfk8_42257850-8b63-4bc9-a884-ead44b084bf1/cert-manager-webhook/0.log" Feb 01 10:14:36 crc kubenswrapper[5127]: I0201 10:14:36.740815 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:14:36 crc kubenswrapper[5127]: I0201 10:14:36.741176 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:14:46 crc kubenswrapper[5127]: I0201 10:14:46.838966 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-m6qwl_1300e867-c21d-4450-bfa2-24c0ab3c8a21/nmstate-console-plugin/0.log" Feb 01 10:14:46 crc kubenswrapper[5127]: I0201 10:14:46.988867 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qmpft_eda42d6f-e70e-40d5-98f4-1329398803ae/nmstate-handler/0.log" Feb 01 10:14:47 crc kubenswrapper[5127]: I0201 10:14:47.116883 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zlwp8_1f8bacee-d920-4264-bfc2-249be9f4c352/nmstate-metrics/0.log" Feb 01 10:14:47 crc kubenswrapper[5127]: I0201 10:14:47.126054 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-zlwp8_1f8bacee-d920-4264-bfc2-249be9f4c352/kube-rbac-proxy/0.log" Feb 01 10:14:47 crc kubenswrapper[5127]: I0201 10:14:47.260384 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-97zr9_0cbcc60e-a163-42c1-871e-fd30c9d8e0f8/nmstate-operator/0.log" Feb 01 10:14:47 crc kubenswrapper[5127]: I0201 10:14:47.353996 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-zpcmk_2049e4f2-f393-4ce1-bd69-33f34b97b2a8/nmstate-webhook/0.log" Feb 01 10:14:52 crc kubenswrapper[5127]: I0201 10:14:52.295911 5127 scope.go:117] "RemoveContainer" containerID="0f00151e818c72264f8fe0fc1b3513537370c9031dc6a9bbe18578190c88d9e7" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.162675 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk"] Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163556 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="extract-utilities" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163569 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="extract-utilities" Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163603 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163610 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163624 5127 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="extract-content" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163631 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="extract-content" Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163645 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="extract-content" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163652 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="extract-content" Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163657 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163663 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: E0201 10:15:00.163676 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="extract-utilities" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163681 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="extract-utilities" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163859 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="4111ae87-b2ca-4dd2-95aa-309b962f98b6" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.163872 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="98099c2f-4ef8-475d-8916-aeaf5df1f25e" containerName="registry-server" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.164555 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.172077 5127 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.172719 5127 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.174596 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk"] Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.269137 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.269540 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.269818 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnzm\" (UniqueName: \"kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.371674 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.371902 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.372227 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnzm\" (UniqueName: \"kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.373789 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume\") pod 
\"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.390834 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.395777 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnzm\" (UniqueName: \"kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm\") pod \"collect-profiles-29499015-m2qsk\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.509638 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:00 crc kubenswrapper[5127]: I0201 10:15:00.977218 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk"] Feb 01 10:15:01 crc kubenswrapper[5127]: I0201 10:15:01.373293 5127 generic.go:334] "Generic (PLEG): container finished" podID="1929df62-f872-439c-8fcf-4c39dfdf4228" containerID="5b7699797fcbf755ef3a6d5f3f579ae458b92508818fa5a616bc1ebb62c7e3c8" exitCode=0 Feb 01 10:15:01 crc kubenswrapper[5127]: I0201 10:15:01.373444 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" event={"ID":"1929df62-f872-439c-8fcf-4c39dfdf4228","Type":"ContainerDied","Data":"5b7699797fcbf755ef3a6d5f3f579ae458b92508818fa5a616bc1ebb62c7e3c8"} Feb 01 10:15:01 crc kubenswrapper[5127]: I0201 10:15:01.373670 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" event={"ID":"1929df62-f872-439c-8fcf-4c39dfdf4228","Type":"ContainerStarted","Data":"7ed117ed2cb6448efae0a3d7e679ff3c72d0ad1d3ed175385d568b90204c962b"} Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.826057 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.938266 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khnzm\" (UniqueName: \"kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm\") pod \"1929df62-f872-439c-8fcf-4c39dfdf4228\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.938442 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume\") pod \"1929df62-f872-439c-8fcf-4c39dfdf4228\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.938551 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume\") pod \"1929df62-f872-439c-8fcf-4c39dfdf4228\" (UID: \"1929df62-f872-439c-8fcf-4c39dfdf4228\") " Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.940551 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume" (OuterVolumeSpecName: "config-volume") pod "1929df62-f872-439c-8fcf-4c39dfdf4228" (UID: "1929df62-f872-439c-8fcf-4c39dfdf4228"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.945741 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm" (OuterVolumeSpecName: "kube-api-access-khnzm") pod "1929df62-f872-439c-8fcf-4c39dfdf4228" (UID: "1929df62-f872-439c-8fcf-4c39dfdf4228"). InnerVolumeSpecName "kube-api-access-khnzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:15:02 crc kubenswrapper[5127]: I0201 10:15:02.965290 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1929df62-f872-439c-8fcf-4c39dfdf4228" (UID: "1929df62-f872-439c-8fcf-4c39dfdf4228"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.041091 5127 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1929df62-f872-439c-8fcf-4c39dfdf4228-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.041133 5127 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1929df62-f872-439c-8fcf-4c39dfdf4228-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.041148 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khnzm\" (UniqueName: \"kubernetes.io/projected/1929df62-f872-439c-8fcf-4c39dfdf4228-kube-api-access-khnzm\") on node \"crc\" DevicePath \"\"" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.398645 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" event={"ID":"1929df62-f872-439c-8fcf-4c39dfdf4228","Type":"ContainerDied","Data":"7ed117ed2cb6448efae0a3d7e679ff3c72d0ad1d3ed175385d568b90204c962b"} Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.398685 5127 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ed117ed2cb6448efae0a3d7e679ff3c72d0ad1d3ed175385d568b90204c962b" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.398716 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499015-m2qsk" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.646902 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fl5c9_b9cff7c9-e3d3-41cb-8b79-76cca738c2f6/prometheus-operator/0.log" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.916173 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm_5d9280e1-7d78-466f-a218-29bc52ab31d5/prometheus-operator-admission-webhook/0.log" Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.925565 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs"] Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.943309 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29498970-8gmxs"] Feb 01 10:15:03 crc kubenswrapper[5127]: I0201 10:15:03.970692 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl_64f4b3f8-dc3b-44c5-ab17-51cec08322b0/prometheus-operator-admission-webhook/0.log" Feb 01 10:15:04 crc kubenswrapper[5127]: I0201 10:15:04.117257 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-b4zck_c15e3a2f-7b85-439d-8fcc-9108c58e7a9e/operator/0.log" Feb 01 10:15:04 crc kubenswrapper[5127]: I0201 10:15:04.152724 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4c7dx_7e9485f5-0c8c-40cb-88de-fae715ae2f3f/perses-operator/0.log" Feb 01 10:15:04 crc kubenswrapper[5127]: I0201 10:15:04.246297 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d20aa4-8168-4dc3-8b16-a3d21ca6b45d" 
path="/var/lib/kubelet/pods/02d20aa4-8168-4dc3-8b16-a3d21ca6b45d/volumes" Feb 01 10:15:06 crc kubenswrapper[5127]: I0201 10:15:06.741259 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:15:06 crc kubenswrapper[5127]: I0201 10:15:06.742134 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.112601 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-wgvph_516e16c5-a825-4ea6-a093-91b77dedc874/kube-rbac-proxy/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.363118 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-wgvph_516e16c5-a825-4ea6-a093-91b77dedc874/controller/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.396241 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-frr-files/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.514563 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-frr-files/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.520413 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-reloader/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.568197 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-reloader/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.578456 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-metrics/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.755255 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-metrics/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.757468 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-frr-files/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.775667 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-metrics/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.782358 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-reloader/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.922379 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-frr-files/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.970767 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-metrics/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.973662 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/controller/0.log" Feb 01 10:15:20 crc kubenswrapper[5127]: I0201 10:15:20.974722 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/cp-reloader/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.126860 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/frr-metrics/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.159808 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/kube-rbac-proxy/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.184413 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/kube-rbac-proxy-frr/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.319950 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/reloader/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.366102 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-x4g6r_a116c9d4-3422-4263-83ab-dd00009d9603/frr-k8s-webhook-server/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.628184 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65f7457996-dlrps_6559471d-0983-456e-9890-5997a2923dd8/manager/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.758722 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74f77b7fdf-vs6dd_3f1c209c-f9db-4d28-a248-dfd64c611455/webhook-server/0.log" Feb 01 10:15:21 crc kubenswrapper[5127]: I0201 10:15:21.878710 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5gpqk_438513cf-4480-46c2-b82e-ca515e475e06/kube-rbac-proxy/0.log" Feb 01 10:15:22 crc kubenswrapper[5127]: I0201 10:15:22.665112 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5gpqk_438513cf-4480-46c2-b82e-ca515e475e06/speaker/0.log" Feb 01 10:15:24 crc kubenswrapper[5127]: I0201 10:15:24.202029 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vpnfg_1db2b7d1-a38e-4b50-8b1e-a30a5d59608a/frr/0.log" Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 10:15:36.740778 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 10:15:36.741457 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 
Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 10:15:36.741518 5127 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 10:15:36.743209 5127 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df"} pod="openshift-machine-config-operator/machine-config-daemon-s2frk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 10:15:36 crc kubenswrapper[5127]: I0201 10:15:36.743315 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" containerID="cri-o://8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" gracePeriod=600 Feb 01 10:15:36 crc kubenswrapper[5127]: E0201 10:15:36.870050 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.643553 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/util/0.log" Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.808779 5127 generic.go:334] "Generic (PLEG): container finished" podID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" exitCode=0 Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.808822 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerDied","Data":"8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df"} Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.808855 5127 scope.go:117] "RemoveContainer" containerID="a98c5f20862ab64ef78617fa7fef99b3bcd55b0f95c97739ea8f9957bb7da176" Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.810050 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:15:37 crc kubenswrapper[5127]: E0201 10:15:37.810527 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.948353 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/pull/0.log" Feb 01 10:15:37 crc kubenswrapper[5127]: I0201 10:15:37.988618 5127 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/util/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.006292 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/pull/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.202199 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/extract/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.210397 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/pull/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.228843 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc6jb52_e86fe952-2a5c-4a01-b82e-53ba47fc92c8/util/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.353680 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/util/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.528245 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/pull/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.562707 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/util/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.578657 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/pull/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.788965 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/util/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.799512 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/pull/0.log" Feb 01 10:15:38 crc kubenswrapper[5127]: I0201 10:15:38.815234 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71345pnw_34f9a5ce-8747-43ec-827e-8392c57165df/extract/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.010791 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.166079 5127 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.185443 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.201047 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.292739 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.345573 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.350463 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xn27c_b6b15690-b95f-417f-956b-78ad11c53bb2/extract/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.480849 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.626767 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.662885 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.672889 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.849904 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/util/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.870803 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/pull/0.log" Feb 01 10:15:39 crc kubenswrapper[5127]: I0201 10:15:39.871808 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08wdtqd_e9aacae9-b0d2-4661-8e56-52e562125b03/extract/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.027434 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-utilities/0.log" Feb 01 
10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.177955 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-content/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.198920 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-content/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.216284 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-utilities/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.410119 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-utilities/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.437287 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/extract-content/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.504394 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xx2dx_96661957-517e-40fd-a208-c0cf8c58c34c/registry-server/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.598663 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-utilities/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.819439 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-content/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.823891 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-content/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.834289 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-utilities/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.968652 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-utilities/0.log" Feb 01 10:15:40 crc kubenswrapper[5127]: I0201 10:15:40.981327 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/extract-content/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.235107 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-utilities/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.324646 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9bpzr_68e6e2cf-dc33-488e-8308-928b146d9aa3/marketplace-operator/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.477872 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-content/0.log" Feb 01 
10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.484315 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-content/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.531793 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-utilities/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.716490 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-utilities/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.766737 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/extract-content/0.log" Feb 01 10:15:41 crc kubenswrapper[5127]: I0201 10:15:41.913097 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-utilities/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.114874 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-utilities/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.125956 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-content/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.153250 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-content/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.246013 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-khrvl_884c63f6-518e-4e8d-9321-0e4ba5668310/registry-server/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.286831 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xqbdb_38271625-b7ec-4011-b426-b4ec1a5bb669/registry-server/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.304086 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-utilities/0.log" Feb 01 10:15:42 crc kubenswrapper[5127]: I0201 10:15:42.374909 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/extract-content/0.log" Feb 01 10:15:43 crc kubenswrapper[5127]: I0201 10:15:43.684418 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-2tgc4_2a50674b-ac62-4f1e-9be7-fe427860937e/registry-server/0.log" Feb 01 10:15:52 crc kubenswrapper[5127]: I0201 10:15:52.236361 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:15:52 crc kubenswrapper[5127]: E0201 10:15:52.237207 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:15:52 crc kubenswrapper[5127]: I0201 10:15:52.362878 5127 scope.go:117] "RemoveContainer" containerID="c92eba0eda55bb189b4b464506e74717b39cf1c8baf0b57112dbc6446530f945" Feb 01 10:15:52 crc kubenswrapper[5127]: I0201 10:15:52.389656 5127 scope.go:117] "RemoveContainer" containerID="6c9a6ae1276cb9452092776526e39621155b93e615efbc432d204b5386f566b8" Feb 01 10:15:57 crc kubenswrapper[5127]: I0201 10:15:57.117552 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d7c98d66-r2hrl_64f4b3f8-dc3b-44c5-ab17-51cec08322b0/prometheus-operator-admission-webhook/0.log" Feb 01 10:15:57 crc kubenswrapper[5127]: I0201 10:15:57.125603 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-fl5c9_b9cff7c9-e3d3-41cb-8b79-76cca738c2f6/prometheus-operator/0.log" Feb 01 10:15:57 crc kubenswrapper[5127]: I0201 10:15:57.145002 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d7c98d66-8dlhm_5d9280e1-7d78-466f-a218-29bc52ab31d5/prometheus-operator-admission-webhook/0.log" Feb 01 10:15:57 crc kubenswrapper[5127]: I0201 10:15:57.294699 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-4c7dx_7e9485f5-0c8c-40cb-88de-fae715ae2f3f/perses-operator/0.log" Feb 01 10:15:57 crc kubenswrapper[5127]: I0201 10:15:57.338931 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-b4zck_c15e3a2f-7b85-439d-8fcc-9108c58e7a9e/operator/0.log" Feb 01 10:16:03 crc kubenswrapper[5127]: I0201 10:16:03.236115 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:16:03 crc kubenswrapper[5127]: E0201 10:16:03.237778 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:16:18 crc kubenswrapper[5127]: I0201 10:16:18.236243 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:16:18 crc kubenswrapper[5127]: E0201 10:16:18.237524 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:16:32 crc kubenswrapper[5127]: I0201 10:16:32.237297 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df"
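
The CrashLoopBackOff errors repeating here (10:15:52, 10:16:03, 10:16:18, 10:16:32) are periodic sync attempts being refused because the restart back-off for machine-config-daemon has reached its ceiling, the 5m0s named in the message. Kubelet grows this delay roughly by doubling up to that cap; a sketch of the schedule (the 10s initial step is an assumption based on kubelet's documented default, the log itself only shows the cap):

package main

import (
	"fmt"
	"time"
)

// backoffDelay returns the delay before restart attempt n (0-based),
// doubling from an initial step up to a fixed ceiling -- the same shape
// as the "back-off 5m0s" delays in the entries above.
func backoffDelay(n int, initial, ceiling time.Duration) time.Duration {
	d := initial
	for i := 0; i < n; i++ {
		d *= 2
		if d >= ceiling {
			return ceiling
		}
	}
	return d
}

func main() {
	for n := 0; n < 7; n++ {
		fmt.Printf("restart %d: wait %v\n", n, backoffDelay(n, 10*time.Second, 5*time.Minute))
	}
	// restart 0: 10s, 1: 20s, 2: 40s, 3: 1m20s, 4: 2m40s, then capped at 5m0s.
}
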
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.263442 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:16:42 crc kubenswrapper[5127]: E0201 10:16:42.264470 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1929df62-f872-439c-8fcf-4c39dfdf4228" containerName="collect-profiles" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.264482 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="1929df62-f872-439c-8fcf-4c39dfdf4228" containerName="collect-profiles" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.264711 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="1929df62-f872-439c-8fcf-4c39dfdf4228" containerName="collect-profiles" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.266224 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.274265 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.373926 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.374000 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.374300 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t46z\" (UniqueName: \"kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.477109 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.477177 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc 
kubenswrapper[5127]: I0201 10:16:42.477246 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t46z\" (UniqueName: \"kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.477618 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.477750 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.495626 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t46z\" (UniqueName: \"kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z\") pod \"community-operators-gclww\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:42 crc kubenswrapper[5127]: I0201 10:16:42.599482 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:43 crc kubenswrapper[5127]: I0201 10:16:43.724234 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:16:43 crc kubenswrapper[5127]: W0201 10:16:43.742084 5127 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode493a326_98bf_4e1e_8c98_df231964abc2.slice/crio-c7ff49cfdf3a73abe1e048a7b1963e31b1120441a071f679bb2cc05f336f3c0a WatchSource:0}: Error finding container c7ff49cfdf3a73abe1e048a7b1963e31b1120441a071f679bb2cc05f336f3c0a: Status 404 returned error can't find the container with id c7ff49cfdf3a73abe1e048a7b1963e31b1120441a071f679bb2cc05f336f3c0a Feb 01 10:16:44 crc kubenswrapper[5127]: I0201 10:16:44.679484 5127 generic.go:334] "Generic (PLEG): container finished" podID="e493a326-98bf-4e1e-8c98-df231964abc2" containerID="67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57" exitCode=0 Feb 01 10:16:44 crc kubenswrapper[5127]: I0201 10:16:44.679978 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerDied","Data":"67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57"} Feb 01 10:16:44 crc kubenswrapper[5127]: I0201 10:16:44.680021 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerStarted","Data":"c7ff49cfdf3a73abe1e048a7b1963e31b1120441a071f679bb2cc05f336f3c0a"} Feb 01 10:16:44 crc kubenswrapper[5127]: I0201 10:16:44.683976 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 10:16:45 crc 
kubenswrapper[5127]: I0201 10:16:45.236455 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:16:45 crc kubenswrapper[5127]: E0201 10:16:45.237246 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:16:46 crc kubenswrapper[5127]: I0201 10:16:46.718192 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerStarted","Data":"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0"} Feb 01 10:16:47 crc kubenswrapper[5127]: I0201 10:16:47.730446 5127 generic.go:334] "Generic (PLEG): container finished" podID="e493a326-98bf-4e1e-8c98-df231964abc2" containerID="d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0" exitCode=0 Feb 01 10:16:47 crc kubenswrapper[5127]: I0201 10:16:47.730501 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerDied","Data":"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0"} Feb 01 10:16:48 crc kubenswrapper[5127]: I0201 10:16:48.754177 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerStarted","Data":"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373"} Feb 01 10:16:48 crc kubenswrapper[5127]: I0201 10:16:48.777546 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gclww" podStartSLOduration=3.34370458 podStartE2EDuration="6.777526479s" podCreationTimestamp="2026-02-01 10:16:42 +0000 UTC" firstStartedPulling="2026-02-01 10:16:44.682892787 +0000 UTC m=+12555.168795200" lastFinishedPulling="2026-02-01 10:16:48.116714726 +0000 UTC m=+12558.602617099" observedRunningTime="2026-02-01 10:16:48.768530937 +0000 UTC m=+12559.254433300" watchObservedRunningTime="2026-02-01 10:16:48.777526479 +0000 UTC m=+12559.263428862" Feb 01 10:16:52 crc kubenswrapper[5127]: I0201 10:16:52.600044 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:52 crc kubenswrapper[5127]: I0201 10:16:52.601194 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:52 crc kubenswrapper[5127]: I0201 10:16:52.653878 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:16:57 crc kubenswrapper[5127]: I0201 10:16:57.236206 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:16:57 crc kubenswrapper[5127]: E0201 10:16:57.238473 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:17:02 crc kubenswrapper[5127]: I0201 10:17:02.699224 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:17:02 crc kubenswrapper[5127]: I0201 10:17:02.768763 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:17:02 crc kubenswrapper[5127]: I0201 10:17:02.925274 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gclww" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="registry-server" containerID="cri-o://17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373" gracePeriod=2 Feb 01 10:17:03 crc kubenswrapper[5127]: E0201 10:17:03.076514 5127 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode493a326_98bf_4e1e_8c98_df231964abc2.slice/crio-conmon-17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373.scope\": RecentStats: unable to find data in memory cache]" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.515189 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.672361 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities\") pod \"e493a326-98bf-4e1e-8c98-df231964abc2\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.672497 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t46z\" (UniqueName: \"kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z\") pod \"e493a326-98bf-4e1e-8c98-df231964abc2\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.672621 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content\") pod \"e493a326-98bf-4e1e-8c98-df231964abc2\" (UID: \"e493a326-98bf-4e1e-8c98-df231964abc2\") " Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.678809 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities" (OuterVolumeSpecName: "utilities") pod "e493a326-98bf-4e1e-8c98-df231964abc2" (UID: "e493a326-98bf-4e1e-8c98-df231964abc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.695127 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z" (OuterVolumeSpecName: "kube-api-access-2t46z") pod "e493a326-98bf-4e1e-8c98-df231964abc2" (UID: "e493a326-98bf-4e1e-8c98-df231964abc2"). InnerVolumeSpecName "kube-api-access-2t46z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.737303 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e493a326-98bf-4e1e-8c98-df231964abc2" (UID: "e493a326-98bf-4e1e-8c98-df231964abc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.777095 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.777178 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t46z\" (UniqueName: \"kubernetes.io/projected/e493a326-98bf-4e1e-8c98-df231964abc2-kube-api-access-2t46z\") on node \"crc\" DevicePath \"\"" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.777201 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e493a326-98bf-4e1e-8c98-df231964abc2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.971281 5127 generic.go:334] "Generic (PLEG): container finished" podID="e493a326-98bf-4e1e-8c98-df231964abc2" containerID="17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373" exitCode=0 Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.971327 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerDied","Data":"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373"} Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.971357 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gclww" event={"ID":"e493a326-98bf-4e1e-8c98-df231964abc2","Type":"ContainerDied","Data":"c7ff49cfdf3a73abe1e048a7b1963e31b1120441a071f679bb2cc05f336f3c0a"} Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.971379 5127 scope.go:117] "RemoveContainer" containerID="17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373" Feb 01 10:17:03 crc kubenswrapper[5127]: I0201 10:17:03.971384 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gclww" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.017313 5127 scope.go:117] "RemoveContainer" containerID="d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.029154 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.043019 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gclww"] Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.064347 5127 scope.go:117] "RemoveContainer" containerID="67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.112005 5127 scope.go:117] "RemoveContainer" containerID="17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373" Feb 01 10:17:04 crc kubenswrapper[5127]: E0201 10:17:04.112790 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373\": container with ID starting with 17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373 not found: ID does not exist" containerID="17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.112829 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373"} err="failed to get container status \"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373\": rpc error: code = NotFound desc = could not find container \"17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373\": container with ID starting with 17bcf90a6ad1686ef33c2a0a98a4b89f25e91c8205a640bc0c3fd96528365373 not found: ID does not exist" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.112867 5127 scope.go:117] "RemoveContainer" containerID="d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0" Feb 01 10:17:04 crc kubenswrapper[5127]: E0201 10:17:04.113264 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0\": container with ID starting with d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0 not found: ID does not exist" containerID="d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.113294 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0"} err="failed to get container status \"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0\": rpc error: code = NotFound desc = could not find container \"d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0\": container with ID starting with d69603dde9deed7a130e9ee1e9496eab67db674f02e4d962aa98a4757e52b5e0 not found: ID does not exist" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.113311 5127 scope.go:117] "RemoveContainer" containerID="67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57" Feb 01 10:17:04 crc kubenswrapper[5127]: E0201 10:17:04.113564 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57\": container with ID starting with 67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57 not found: ID does not exist" containerID="67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.113610 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57"} err="failed to get container status \"67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57\": rpc error: code = NotFound desc = could not find container \"67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57\": container with ID starting with 67d7b3774dd8ec8de3fb43eb4e32b5feb7944021cb65e4a00f4831fb76950f57 not found: ID does not exist" Feb 01 10:17:04 crc kubenswrapper[5127]: I0201 10:17:04.261860 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" path="/var/lib/kubelet/pods/e493a326-98bf-4e1e-8c98-df231964abc2/volumes" Feb 01 10:17:08 crc kubenswrapper[5127]: I0201 10:17:08.239798 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:17:08 crc kubenswrapper[5127]: E0201 10:17:08.241451 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:17:21 crc kubenswrapper[5127]: I0201 10:17:21.237450 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:17:21 crc kubenswrapper[5127]: E0201 10:17:21.244542 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:17:34 crc kubenswrapper[5127]: I0201 10:17:34.240232 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:17:34 crc kubenswrapper[5127]: E0201 10:17:34.241079 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:17:47 crc kubenswrapper[5127]: I0201 10:17:47.235537 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:17:47 crc kubenswrapper[5127]: E0201 10:17:47.236849 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:17:59 crc kubenswrapper[5127]: I0201 10:17:59.236725 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:17:59 crc kubenswrapper[5127]: E0201 10:17:59.237778 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:18:10 crc kubenswrapper[5127]: I0201 10:18:10.252101 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:18:10 crc kubenswrapper[5127]: E0201 10:18:10.253231 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:18:24 crc kubenswrapper[5127]: I0201 10:18:24.236745 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:18:24 crc kubenswrapper[5127]: E0201 10:18:24.237795 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:18:35 crc kubenswrapper[5127]: I0201 10:18:35.237128 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:18:35 crc kubenswrapper[5127]: E0201 10:18:35.238414 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:18:46 crc kubenswrapper[5127]: I0201 10:18:46.235824 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:18:46 crc kubenswrapper[5127]: E0201 10:18:46.236666 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:18:57 crc kubenswrapper[5127]: I0201 10:18:57.236832 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:18:57 crc kubenswrapper[5127]: E0201 10:18:57.240507 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:19:09 crc kubenswrapper[5127]: I0201 10:19:09.236219 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:19:09 crc kubenswrapper[5127]: E0201 10:19:09.237361 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:19:22 crc kubenswrapper[5127]: I0201 10:19:22.236336 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:19:22 crc kubenswrapper[5127]: E0201 10:19:22.237410 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:19:36 crc kubenswrapper[5127]: I0201 10:19:36.121163 5127 generic.go:334] "Generic (PLEG): container finished" podID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerID="bd5f91be99d2c3b3d4171299c5a90f0c59d4417d7d060480a10a77638904ea73" exitCode=0 Feb 01 10:19:36 crc kubenswrapper[5127]: I0201 10:19:36.121282 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lw44g/must-gather-664vp" event={"ID":"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520","Type":"ContainerDied","Data":"bd5f91be99d2c3b3d4171299c5a90f0c59d4417d7d060480a10a77638904ea73"} Feb 01 10:19:36 crc kubenswrapper[5127]: I0201 10:19:36.122018 5127 scope.go:117] "RemoveContainer" containerID="bd5f91be99d2c3b3d4171299c5a90f0c59d4417d7d060480a10a77638904ea73" Feb 01 10:19:36 crc kubenswrapper[5127]: I0201 10:19:36.236124 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:19:36 crc kubenswrapper[5127]: E0201 10:19:36.236741 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:19:36 crc kubenswrapper[5127]: I0201 10:19:36.727459 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lw44g_must-gather-664vp_5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520/gather/0.log" Feb 01 10:19:47 crc kubenswrapper[5127]: I0201 10:19:47.235073 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:19:47 crc kubenswrapper[5127]: E0201 10:19:47.235782 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:19:47 crc kubenswrapper[5127]: I0201 10:19:47.940132 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lw44g/must-gather-664vp"] Feb 01 10:19:47 crc kubenswrapper[5127]: I0201 10:19:47.940661 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lw44g/must-gather-664vp" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="copy" containerID="cri-o://20ec62c3eb2c70ccde5e98553374fba2b49e0e484bea28a21b78dd7515089749" gracePeriod=2 Feb 01 10:19:47 crc kubenswrapper[5127]: I0201 10:19:47.953017 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lw44g/must-gather-664vp"] Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.300451 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lw44g_must-gather-664vp_5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520/copy/0.log" Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.319113 5127 generic.go:334] "Generic (PLEG): container finished" podID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerID="20ec62c3eb2c70ccde5e98553374fba2b49e0e484bea28a21b78dd7515089749" exitCode=143 Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.532664 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lw44g_must-gather-664vp_5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520/copy/0.log" Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.533515 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.731793 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output\") pod \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.731956 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssq6s\" (UniqueName: \"kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s\") pod \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\" (UID: \"5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520\") " Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.757784 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s" (OuterVolumeSpecName: "kube-api-access-ssq6s") pod "5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" (UID: "5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520"). InnerVolumeSpecName "kube-api-access-ssq6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:19:48 crc kubenswrapper[5127]: I0201 10:19:48.836007 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssq6s\" (UniqueName: \"kubernetes.io/projected/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-kube-api-access-ssq6s\") on node \"crc\" DevicePath \"\"" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.005813 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" (UID: "5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.040296 5127 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.330261 5127 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lw44g_must-gather-664vp_5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520/copy/0.log" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.330675 5127 scope.go:117] "RemoveContainer" containerID="20ec62c3eb2c70ccde5e98553374fba2b49e0e484bea28a21b78dd7515089749" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.330748 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lw44g/must-gather-664vp" Feb 01 10:19:49 crc kubenswrapper[5127]: I0201 10:19:49.355491 5127 scope.go:117] "RemoveContainer" containerID="bd5f91be99d2c3b3d4171299c5a90f0c59d4417d7d060480a10a77638904ea73" Feb 01 10:19:50 crc kubenswrapper[5127]: I0201 10:19:50.252423 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" path="/var/lib/kubelet/pods/5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520/volumes" Feb 01 10:19:59 crc kubenswrapper[5127]: I0201 10:19:59.235686 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:19:59 crc kubenswrapper[5127]: E0201 10:19:59.236543 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:20:12 crc kubenswrapper[5127]: I0201 10:20:12.236696 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:20:12 crc kubenswrapper[5127]: E0201 10:20:12.237829 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:20:25 crc kubenswrapper[5127]: I0201 10:20:25.236066 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:20:25 crc kubenswrapper[5127]: E0201 10:20:25.237030 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:20:36 crc kubenswrapper[5127]: I0201 10:20:36.235841 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:20:36 crc kubenswrapper[5127]: E0201 10:20:36.236661 5127 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-s2frk_openshift-machine-config-operator(874ffcf5-fe2e-4225-a2a1-38f900cbffaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" Feb 01 10:20:48 crc kubenswrapper[5127]: I0201 10:20:48.237796 5127 scope.go:117] "RemoveContainer" containerID="8208993972bbe019f845649658120cbdb17b1d7ae5b7ac6ee0c5eab3729c88df" Feb 01 10:20:49 crc kubenswrapper[5127]: I0201 10:20:49.127086 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-s2frk" event={"ID":"874ffcf5-fe2e-4225-a2a1-38f900cbffaf","Type":"ContainerStarted","Data":"7f33380bc529c0eb8291b1b68a139a40b2e8dc35262998b233ee8dd5b3ae3488"} Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.333126 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:03 crc kubenswrapper[5127]: E0201 10:22:03.334523 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="gather" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.334549 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="gather" Feb 01 10:22:03 crc kubenswrapper[5127]: E0201 10:22:03.334574 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="copy" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.334619 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="copy" Feb 01 10:22:03 crc kubenswrapper[5127]: E0201 10:22:03.334659 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="extract-content" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.334673 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="extract-content" Feb 01 10:22:03 crc kubenswrapper[5127]: E0201 10:22:03.334719 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="registry-server" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.334737 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="registry-server" Feb 01 10:22:03 crc kubenswrapper[5127]: E0201 10:22:03.334807 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="extract-utilities" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.334828 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="extract-utilities" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.335255 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="copy" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.335288 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="e493a326-98bf-4e1e-8c98-df231964abc2" containerName="registry-server" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.335318 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d58e1dc-8b1a-4e9e-9acf-4d56bbb0c520" containerName="gather" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.339658 5127 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.351075 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.431748 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.432043 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.432283 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqbnm\" (UniqueName: \"kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.533965 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.534043 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqbnm\" (UniqueName: \"kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.534120 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.534709 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.534735 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.555793 5127 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pqbnm\" (UniqueName: \"kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm\") pod \"certified-operators-rwwph\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:03 crc kubenswrapper[5127]: I0201 10:22:03.679878 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:04 crc kubenswrapper[5127]: I0201 10:22:04.179979 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:05 crc kubenswrapper[5127]: I0201 10:22:05.142853 5127 generic.go:334] "Generic (PLEG): container finished" podID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerID="77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300" exitCode=0 Feb 01 10:22:05 crc kubenswrapper[5127]: I0201 10:22:05.142978 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerDied","Data":"77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300"} Feb 01 10:22:05 crc kubenswrapper[5127]: I0201 10:22:05.143313 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerStarted","Data":"f8d32145e09f233880c3f46b44a7d5a58448c370527f82a45af3bdc5dce56d5e"} Feb 01 10:22:05 crc kubenswrapper[5127]: I0201 10:22:05.146713 5127 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 10:22:06 crc kubenswrapper[5127]: I0201 10:22:06.157392 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerStarted","Data":"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa"} Feb 01 10:22:07 crc kubenswrapper[5127]: I0201 10:22:07.168956 5127 generic.go:334] "Generic (PLEG): container finished" podID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerID="39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa" exitCode=0 Feb 01 10:22:07 crc kubenswrapper[5127]: I0201 10:22:07.169049 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerDied","Data":"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa"} Feb 01 10:22:08 crc kubenswrapper[5127]: I0201 10:22:08.202620 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerStarted","Data":"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d"} Feb 01 10:22:08 crc kubenswrapper[5127]: I0201 10:22:08.223062 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rwwph" podStartSLOduration=2.766310517 podStartE2EDuration="5.223038722s" podCreationTimestamp="2026-02-01 10:22:03 +0000 UTC" firstStartedPulling="2026-02-01 10:22:05.146398543 +0000 UTC m=+12875.632300916" lastFinishedPulling="2026-02-01 10:22:07.603126758 +0000 UTC m=+12878.089029121" observedRunningTime="2026-02-01 10:22:08.21965937 +0000 UTC m=+12878.705561743" watchObservedRunningTime="2026-02-01 
10:22:08.223038722 +0000 UTC m=+12878.708941095" Feb 01 10:22:13 crc kubenswrapper[5127]: I0201 10:22:13.680743 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:13 crc kubenswrapper[5127]: I0201 10:22:13.681367 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:13 crc kubenswrapper[5127]: I0201 10:22:13.748419 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:14 crc kubenswrapper[5127]: I0201 10:22:14.369550 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:14 crc kubenswrapper[5127]: I0201 10:22:14.439829 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:16 crc kubenswrapper[5127]: I0201 10:22:16.305927 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rwwph" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="registry-server" containerID="cri-o://39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d" gracePeriod=2 Feb 01 10:22:16 crc kubenswrapper[5127]: I0201 10:22:16.913447 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.063961 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqbnm\" (UniqueName: \"kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm\") pod \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.064195 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content\") pod \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.064265 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities\") pod \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\" (UID: \"8806d78f-9f7e-4cb7-8b76-3413a95c1b30\") " Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.065179 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities" (OuterVolumeSpecName: "utilities") pod "8806d78f-9f7e-4cb7-8b76-3413a95c1b30" (UID: "8806d78f-9f7e-4cb7-8b76-3413a95c1b30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.074173 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm" (OuterVolumeSpecName: "kube-api-access-pqbnm") pod "8806d78f-9f7e-4cb7-8b76-3413a95c1b30" (UID: "8806d78f-9f7e-4cb7-8b76-3413a95c1b30"). InnerVolumeSpecName "kube-api-access-pqbnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.166283 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqbnm\" (UniqueName: \"kubernetes.io/projected/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-kube-api-access-pqbnm\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.166320 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.327488 5127 generic.go:334] "Generic (PLEG): container finished" podID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerID="39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d" exitCode=0 Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.327549 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerDied","Data":"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d"} Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.327666 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rwwph" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.327709 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rwwph" event={"ID":"8806d78f-9f7e-4cb7-8b76-3413a95c1b30","Type":"ContainerDied","Data":"f8d32145e09f233880c3f46b44a7d5a58448c370527f82a45af3bdc5dce56d5e"} Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.327741 5127 scope.go:117] "RemoveContainer" containerID="39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.347720 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8806d78f-9f7e-4cb7-8b76-3413a95c1b30" (UID: "8806d78f-9f7e-4cb7-8b76-3413a95c1b30"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.354325 5127 scope.go:117] "RemoveContainer" containerID="39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.370020 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8806d78f-9f7e-4cb7-8b76-3413a95c1b30-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.387968 5127 scope.go:117] "RemoveContainer" containerID="77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.463029 5127 scope.go:117] "RemoveContainer" containerID="39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d" Feb 01 10:22:17 crc kubenswrapper[5127]: E0201 10:22:17.464130 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d\": container with ID starting with 39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d not found: ID does not exist" containerID="39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.464209 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d"} err="failed to get container status \"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d\": rpc error: code = NotFound desc = could not find container \"39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d\": container with ID starting with 39ca0177356ff7ac981098f47e6cc196571cace19e4d99c3774edf9d29a4013d not found: ID does not exist" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.464247 5127 scope.go:117] "RemoveContainer" containerID="39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa" Feb 01 10:22:17 crc kubenswrapper[5127]: E0201 10:22:17.464678 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa\": container with ID starting with 39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa not found: ID does not exist" containerID="39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.464720 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa"} err="failed to get container status \"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa\": rpc error: code = NotFound desc = could not find container \"39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa\": container with ID starting with 39756f3726ed8f2fe8db0fba439063c749cac056ebab0f43da0447a0bae3e9fa not found: ID does not exist" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.464752 5127 scope.go:117] "RemoveContainer" containerID="77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300" Feb 01 10:22:17 crc kubenswrapper[5127]: E0201 10:22:17.465067 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300\": container with ID starting with 77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300 not found: ID does not exist" containerID="77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.465102 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300"} err="failed to get container status \"77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300\": rpc error: code = NotFound desc = could not find container \"77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300\": container with ID starting with 77e398e4b2aa5d2a9f7c50516d44bdaaa55453033ff808ea43c973cc6555f300 not found: ID does not exist" Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.682985 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:17 crc kubenswrapper[5127]: I0201 10:22:17.692249 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rwwph"] Feb 01 10:22:18 crc kubenswrapper[5127]: I0201 10:22:18.246942 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" path="/var/lib/kubelet/pods/8806d78f-9f7e-4cb7-8b76-3413a95c1b30/volumes" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.645980 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"] Feb 01 10:22:41 crc kubenswrapper[5127]: E0201 10:22:41.649309 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="registry-server" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.649340 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="registry-server" Feb 01 10:22:41 crc kubenswrapper[5127]: E0201 10:22:41.649392 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="extract-utilities" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.649402 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="extract-utilities" Feb 01 10:22:41 crc kubenswrapper[5127]: E0201 10:22:41.649417 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="extract-content" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.649425 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="extract-content" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.649746 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="8806d78f-9f7e-4cb7-8b76-3413a95c1b30" containerName="registry-server" Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.651493 5127 util.go:30] "No sandbox for pod can be found. 
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.678068 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"]
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.776139 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r4cw\" (UniqueName: \"kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.776646 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.776944 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.879133 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.879255 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.879296 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4cw\" (UniqueName: \"kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.880039 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.880249 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:41 crc kubenswrapper[5127]: I0201 10:22:41.898192 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r4cw\" (UniqueName: \"kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw\") pod \"redhat-marketplace-4nfqh\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") " pod="openshift-marketplace/redhat-marketplace-4nfqh"
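[Editor's note] The VerifyControllerAttachedVolume → MountVolume.SetUp run above is the kubelet volume reconciler closing the gap between the desired state of the world (the pod spec wants utilities, catalog-content and the projected service-account token) and the actual state (nothing mounted yet for pod 26250024-5a9f-43ac-9789-5b23baa8f4ce). A minimal sketch of that desired-vs-actual loop under illustrative names (reconcileMounts and setUp are not the kubelet's real symbols):

package main

import "fmt"

// reconcileMounts sketches the reconciler pattern in the log: for every
// volume the pod spec wants (desired) that is not yet in the actual state
// of the world (mounted), run SetUp and record the result.
func reconcileMounts(desired []string, mounted map[string]bool,
	setUp func(vol string) error) error {
	for _, vol := range desired {
		if mounted[vol] {
			continue // actual state already matches desired state
		}
		if err := setUp(vol); err != nil {
			return fmt.Errorf("MountVolume.SetUp failed for %q: %w", vol, err)
		}
		mounted[vol] = true
	}
	return nil
}

func main() {
	mounted := map[string]bool{}
	setUp := func(vol string) error {
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
		return nil
	}
	_ = reconcileMounts(
		[]string{"utilities", "catalog-content", "kube-api-access-5r4cw"},
		mounted, setUp)
}

Because the loop only acts on the difference between the two states, re-running it after a partial failure retries just the missing volumes, which is why these entries can repeat harmlessly. [end note]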
Feb 01 10:22:42 crc kubenswrapper[5127]: I0201 10:22:42.014002 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:42 crc kubenswrapper[5127]: I0201 10:22:42.499815 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"]
Feb 01 10:22:42 crc kubenswrapper[5127]: I0201 10:22:42.715313 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerStarted","Data":"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83"}
Feb 01 10:22:42 crc kubenswrapper[5127]: I0201 10:22:42.715878 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerStarted","Data":"711aaca94cbde8d3eac144cc8acd7b28e96f344d78b3bbd79a0f06f86371014c"}
Feb 01 10:22:43 crc kubenswrapper[5127]: I0201 10:22:43.732043 5127 generic.go:334] "Generic (PLEG): container finished" podID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerID="d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83" exitCode=0
Feb 01 10:22:43 crc kubenswrapper[5127]: I0201 10:22:43.732147 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerDied","Data":"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83"}
Feb 01 10:22:43 crc kubenswrapper[5127]: I0201 10:22:43.732433 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerStarted","Data":"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0"}
Feb 01 10:22:44 crc kubenswrapper[5127]: I0201 10:22:44.753778 5127 generic.go:334] "Generic (PLEG): container finished" podID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerID="cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0" exitCode=0
Feb 01 10:22:44 crc kubenswrapper[5127]: I0201 10:22:44.753898 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerDied","Data":"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0"}
Feb 01 10:22:45 crc kubenswrapper[5127]: I0201 10:22:45.768358 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerStarted","Data":"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131"}
Feb 01 10:22:45 crc kubenswrapper[5127]: I0201 10:22:45.799869 5127 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4nfqh" podStartSLOduration=2.367656289 podStartE2EDuration="4.799849794s" podCreationTimestamp="2026-02-01 10:22:41 +0000 UTC" firstStartedPulling="2026-02-01 10:22:42.719095986 +0000 UTC m=+12913.204998359" lastFinishedPulling="2026-02-01 10:22:45.151289471 +0000 UTC m=+12915.637191864" observedRunningTime="2026-02-01 10:22:45.799276679 +0000 UTC m=+12916.285179052" watchObservedRunningTime="2026-02-01 10:22:45.799849794 +0000 UTC m=+12916.285752157"
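[Editor's note] The pod_startup_latency_tracker entry decomposes cleanly: podStartE2EDuration is observedRunningTime minus podCreationTimestamp (10:22:45.7998 − 10:22:41 ≈ 4.7998s), and podStartSLOduration subtracts the image-pull window, lastFinishedPulling − firstStartedPulling (≈ 2.4322s), leaving ≈ 2.3677s. A quick check of that arithmetic with the wall-clock values copied from the entry (kubelet's own figure, 2.367656289, differs by a few tens of nanoseconds because it is computed from the monotonic m=+… readings):

package main

import (
	"fmt"
	"time"
)

// mustParse parses the timestamp format used in the log entry.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-02-01 10:22:41 +0000 UTC")
	firstPull := mustParse("2026-02-01 10:22:42.719095986 +0000 UTC")
	lastPull := mustParse("2026-02-01 10:22:45.151289471 +0000 UTC")
	running := mustParse("2026-02-01 10:22:45.799849794 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // E2E minus the image-pull window
	fmt.Println(e2e, slo)                // 4.799849794s 2.367656309s
}

[end note]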
Feb 01 10:22:52 crc kubenswrapper[5127]: I0201 10:22:52.014285 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:52 crc kubenswrapper[5127]: I0201 10:22:52.015003 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:52 crc kubenswrapper[5127]: I0201 10:22:52.099161 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:52 crc kubenswrapper[5127]: I0201 10:22:52.923614 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:52 crc kubenswrapper[5127]: I0201 10:22:52.970037 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"]
Feb 01 10:22:54 crc kubenswrapper[5127]: I0201 10:22:54.883233 5127 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4nfqh" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="registry-server" containerID="cri-o://ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131" gracePeriod=2
Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.383348 5127 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfqh"
Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.495231 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content\") pod \"26250024-5a9f-43ac-9789-5b23baa8f4ce\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") "
Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.495811 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities\") pod \"26250024-5a9f-43ac-9789-5b23baa8f4ce\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") "
Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.495917 5127 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r4cw\" (UniqueName: \"kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw\") pod \"26250024-5a9f-43ac-9789-5b23baa8f4ce\" (UID: \"26250024-5a9f-43ac-9789-5b23baa8f4ce\") "
Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.496728 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities" (OuterVolumeSpecName: "utilities") pod "26250024-5a9f-43ac-9789-5b23baa8f4ce" (UID: "26250024-5a9f-43ac-9789-5b23baa8f4ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
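[Editor's note] "Killing container with a grace period" with gracePeriod=2 is the standard TERM-then-KILL sequence: signal the process, give it up to two seconds to exit cleanly, then force-kill. A generic Go sketch of that pattern (killWithGrace is illustrative; it is not the kubelet/CRI-O code path, which delegates the actual signalling to the runtime):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sketches the TERM-then-KILL pattern behind
// "Killing container with a grace period" (gracePeriod=2).
func killWithGrace(p *os.Process, grace time.Duration) error {
	if err := p.Signal(syscall.SIGTERM); err != nil {
		return err // process may already be gone
	}
	done := make(chan error, 1)
	go func() {
		_, err := p.Wait()
		done <- err
	}()
	select {
	case err := <-done:
		return err // exited on its own within the grace period
	case <-time.After(grace):
		return p.Kill() // grace period elapsed: send SIGKILL
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for a container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	fmt.Println(killWithGrace(cmd.Process, 2*time.Second))
}

Here the registry-server exits with exitCode=0 before the deadline (see the ContainerDied event below), so the SIGKILL branch is never taken. [end note]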
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.497203 5127 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.508305 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw" (OuterVolumeSpecName: "kube-api-access-5r4cw") pod "26250024-5a9f-43ac-9789-5b23baa8f4ce" (UID: "26250024-5a9f-43ac-9789-5b23baa8f4ce"). InnerVolumeSpecName "kube-api-access-5r4cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.520368 5127 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26250024-5a9f-43ac-9789-5b23baa8f4ce" (UID: "26250024-5a9f-43ac-9789-5b23baa8f4ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.599936 5127 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26250024-5a9f-43ac-9789-5b23baa8f4ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.600008 5127 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r4cw\" (UniqueName: \"kubernetes.io/projected/26250024-5a9f-43ac-9789-5b23baa8f4ce-kube-api-access-5r4cw\") on node \"crc\" DevicePath \"\"" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.899384 5127 generic.go:334] "Generic (PLEG): container finished" podID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerID="ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131" exitCode=0 Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.899426 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerDied","Data":"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131"} Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.899451 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nfqh" event={"ID":"26250024-5a9f-43ac-9789-5b23baa8f4ce","Type":"ContainerDied","Data":"711aaca94cbde8d3eac144cc8acd7b28e96f344d78b3bbd79a0f06f86371014c"} Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.899469 5127 scope.go:117] "RemoveContainer" containerID="ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.899612 5127 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nfqh" Feb 01 10:22:55 crc kubenswrapper[5127]: I0201 10:22:55.930034 5127 scope.go:117] "RemoveContainer" containerID="cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:55.991965 5127 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"] Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.008677 5127 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nfqh"] Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:55.997121 5127 scope.go:117] "RemoveContainer" containerID="d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.040245 5127 scope.go:117] "RemoveContainer" containerID="ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131" Feb 01 10:22:56 crc kubenswrapper[5127]: E0201 10:22:56.040892 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131\": container with ID starting with ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131 not found: ID does not exist" containerID="ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.040925 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131"} err="failed to get container status \"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131\": rpc error: code = NotFound desc = could not find container \"ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131\": container with ID starting with ef8ec6c367523af49b6e8dd3e10a34a63796534a14efabb2770a519609542131 not found: ID does not exist" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.040947 5127 scope.go:117] "RemoveContainer" containerID="cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0" Feb 01 10:22:56 crc kubenswrapper[5127]: E0201 10:22:56.041357 5127 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0\": container with ID starting with cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0 not found: ID does not exist" containerID="cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.041374 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0"} err="failed to get container status \"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0\": rpc error: code = NotFound desc = could not find container \"cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0\": container with ID starting with cc7c33a2a92c6bf55c7a335343c6353162d6678cd7556f494245626a9a8f21b0 not found: ID does not exist" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.041385 5127 scope.go:117] "RemoveContainer" containerID="d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83" Feb 01 10:22:56 crc kubenswrapper[5127]: E0201 10:22:56.042306 5127 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83\": container with ID starting with d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83 not found: ID does not exist" containerID="d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.042356 5127 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83"} err="failed to get container status \"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83\": rpc error: code = NotFound desc = could not find container \"d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83\": container with ID starting with d024f14b5a64e7cb710e653f18fc7898c4a562938213e297c92f92bd26136f83 not found: ID does not exist" Feb 01 10:22:56 crc kubenswrapper[5127]: I0201 10:22:56.254894 5127 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" path="/var/lib/kubelet/pods/26250024-5a9f-43ac-9789-5b23baa8f4ce/volumes" Feb 01 10:23:06 crc kubenswrapper[5127]: I0201 10:23:06.741325 5127 patch_prober.go:28] interesting pod/machine-config-daemon-s2frk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 10:23:06 crc kubenswrapper[5127]: I0201 10:23:06.742155 5127 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-s2frk" podUID="874ffcf5-fe2e-4225-a2a1-38f900cbffaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.564808 5127 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9j982"] Feb 01 10:23:12 crc kubenswrapper[5127]: E0201 10:23:12.565801 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="extract-utilities" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.565818 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="extract-utilities" Feb 01 10:23:12 crc kubenswrapper[5127]: E0201 10:23:12.565873 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="registry-server" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.565882 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="registry-server" Feb 01 10:23:12 crc kubenswrapper[5127]: E0201 10:23:12.565899 5127 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="extract-content" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.565907 5127 state_mem.go:107] "Deleted CPUSet assignment" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="extract-content" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.566139 5127 memory_manager.go:354] "RemoveStaleState removing state" podUID="26250024-5a9f-43ac-9789-5b23baa8f4ce" containerName="registry-server" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 
10:23:12.567946 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.583482 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9j982"] Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.619779 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc6g6\" (UniqueName: \"kubernetes.io/projected/86ba6844-1b9e-47fb-93ab-0a58492a8eae-kube-api-access-qc6g6\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.619856 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-utilities\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.619907 5127 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-catalog-content\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.722524 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc6g6\" (UniqueName: \"kubernetes.io/projected/86ba6844-1b9e-47fb-93ab-0a58492a8eae-kube-api-access-qc6g6\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.722603 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-utilities\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.722643 5127 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-catalog-content\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.723259 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-catalog-content\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.723321 5127 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba6844-1b9e-47fb-93ab-0a58492a8eae-utilities\") pod \"redhat-operators-9j982\" (UID: \"86ba6844-1b9e-47fb-93ab-0a58492a8eae\") " pod="openshift-marketplace/redhat-operators-9j982" Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.745653 5127 
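[Editor's note] For the two emptyDir volumes here (utilities, catalog-content), MountVolume.SetUp is little more than creating a per-pod directory under /var/lib/kubelet/pods/<uid>/volumes/kubernetes.io~empty-dir/, which is why it completes within a millisecond of starting, while the projected kube-api-access volume (token plus CA bundle assembly) lands about 20 ms later. A sketch of that directory layout, illustrative and default-medium only (the real plugin also handles tmpfs media, SELinux labels and quotas):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// emptyDirSetUp sketches the default-medium emptyDir SetUp: create the
// per-pod volume directory under the kubelet pods dir.
func emptyDirSetUp(podsDir, podUID, volName string) (string, error) {
	dir := filepath.Join(podsDir, podUID,
		"volumes", "kubernetes.io~empty-dir", volName)
	return dir, os.MkdirAll(dir, 0o777)
}

func main() {
	// /tmp/pods stands in for /var/lib/kubelet/pods to keep this runnable
	// without root; the UID and volume name are taken from the log.
	dir, err := emptyDirSetUp("/tmp/pods",
		"86ba6844-1b9e-47fb-93ab-0a58492a8eae", "utilities")
	fmt.Println(dir, err)
}

[end note]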
Feb 01 10:23:12 crc kubenswrapper[5127]: I0201 10:23:12.889289 5127 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9j982"
Feb 01 10:23:13 crc kubenswrapper[5127]: I0201 10:23:13.421047 5127 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9j982"]
Feb 01 10:23:14 crc kubenswrapper[5127]: I0201 10:23:14.152632 5127 generic.go:334] "Generic (PLEG): container finished" podID="86ba6844-1b9e-47fb-93ab-0a58492a8eae" containerID="5ae20fe67cb316dcbab04f98ace5cdfe6fab0b5e7b9b3aecaa5cdd8a6499e76c" exitCode=0
Feb 01 10:23:14 crc kubenswrapper[5127]: I0201 10:23:14.152693 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j982" event={"ID":"86ba6844-1b9e-47fb-93ab-0a58492a8eae","Type":"ContainerDied","Data":"5ae20fe67cb316dcbab04f98ace5cdfe6fab0b5e7b9b3aecaa5cdd8a6499e76c"}
Feb 01 10:23:14 crc kubenswrapper[5127]: I0201 10:23:14.153341 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j982" event={"ID":"86ba6844-1b9e-47fb-93ab-0a58492a8eae","Type":"ContainerStarted","Data":"5d576b73eb420642b75f35f1d106c6fd32bf42d118b777d645a3f6aae8759841"}
Feb 01 10:23:15 crc kubenswrapper[5127]: I0201 10:23:15.188397 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j982" event={"ID":"86ba6844-1b9e-47fb-93ab-0a58492a8eae","Type":"ContainerStarted","Data":"cecfec592bdd7f744ffd61eaa3293dc963402ece6e488750a87d4ae9350fa2dc"}
Feb 01 10:23:20 crc kubenswrapper[5127]: I0201 10:23:20.269225 5127 generic.go:334] "Generic (PLEG): container finished" podID="86ba6844-1b9e-47fb-93ab-0a58492a8eae" containerID="cecfec592bdd7f744ffd61eaa3293dc963402ece6e488750a87d4ae9350fa2dc" exitCode=0
Feb 01 10:23:20 crc kubenswrapper[5127]: I0201 10:23:20.269288 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j982" event={"ID":"86ba6844-1b9e-47fb-93ab-0a58492a8eae","Type":"ContainerDied","Data":"cecfec592bdd7f744ffd61eaa3293dc963402ece6e488750a87d4ae9350fa2dc"}
Feb 01 10:23:21 crc kubenswrapper[5127]: I0201 10:23:21.293048 5127 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9j982" event={"ID":"86ba6844-1b9e-47fb-93ab-0a58492a8eae","Type":"ContainerStarted","Data":"391cbf7b80121f021908baad759d6081f2ba1152058e497123bbacc7c9b15c0a"}
Feb 01 10:23:22 crc kubenswrapper[5127]: I0201 10:23:22.890035 5127 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9j982"
Feb 01 10:23:22 crc kubenswrapper[5127]: I0201 10:23:22.890401 5127 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9j982"
Feb 01 10:23:23 crc kubenswrapper[5127]: I0201 10:23:23.937452 5127 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9j982" podUID="86ba6844-1b9e-47fb-93ab-0a58492a8eae" containerName="registry-server" probeResult="failure" output=<
Feb 01 10:23:23 crc kubenswrapper[5127]: timeout: failed to connect service ":50051" within 1s
Feb 01 10:23:23 crc kubenswrapper[5127]: >
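[Editor's note] The closing startup-probe failure is the gRPC variant: the registry-server's health endpoint on :50051 did not answer within the 1s deadline because the freshly extracted catalog was still loading (the output format matches the grpc_health_probe tool these catalog pods typically use). The earlier redhat-marketplace-4nfqh pod showed the same single "unhealthy" before flipping to "started", so one failure here is expected noise. A minimal sketch of such a check, assuming the server exposes the standard grpc.health.v1 service:

package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

// grpcHealthy sketches the probe behind "failed to connect service
// \":50051\" within 1s": dial the port, call the standard grpc.health.v1
// Check RPC, and require SERVING before the deadline expires.
func grpcHealthy(addr string) error {
	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, addr,
		grpc.WithTransportCredentials(insecure.NewCredentials()),
		grpc.WithBlock()) // block so a dead port fails here, not at the RPC
	if err != nil {
		return fmt.Errorf("timeout: failed to connect service %q within 1s", addr)
	}
	defer conn.Close()

	resp, err := healthpb.NewHealthClient(conn).Check(ctx,
		&healthpb.HealthCheckRequest{})
	if err != nil {
		return err
	}
	if resp.Status != healthpb.HealthCheckResponse_SERVING {
		return fmt.Errorf("service not serving: %s", resp.Status)
	}
	return nil
}

func main() {
	fmt.Println(grpcHealthy("127.0.0.1:50051"))
}

[end note]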